Virginia's VALLSS 2024-25 Fall-Spring Growth Report

Programming note: I’m back after a long summer hiatus. I honestly just kind of lost motivation to do weekly posts. So, moving forward, I’m not going to promise weekly posts. I’ll shoot for one or two posts a month, or whenever something strikes me that I want to write about.


I just read the Virginia Language and Literacy Screening System (VALLSS) 2024-25 fall-to-spring growth report, which was sent out by the Virginia Department of Education (VDOE) about a week or so ago.

For my non-Virginia friends, or for those in Virginia who aren’t familiar with VALLSS, it’s a literacy screener for students in grades K-3 (note that assessments for 4-8 are being rolled out this year) intended to identify students who are at risk for reading problems. Students who are classified in the “high risk” band – based on their scores on the VALLSS screener – are required to receive additional reading interventions.

Students in grades K-3 take VALLSS screeners three times per year: in the fall, mid-year (winter), and spring, although 2nd and 3rd grade students scoring in the “low-risk” band at mid-year don’t have to take the spring assessment. I think.

Anyway. The point of this post isn’t to go deep into VALLSS. I want to focus on the report mentioned earlier.

Overall, I think the report is…fine? I mean, it’s not like we should expect anything salacious or hugely revelatory from a report describing a mandatory statewide assessment.

That said, there is one component of the report that I took issue with. In multiple places in the report, the authors noted that “students made greater progress from Fall to Mid-Year than from Mid-Year to Spring.”

I’ll preface everything I write here by admitting that my expertise is not in early literacy. I know a little bit – mostly from me asking my more-knowledgeable colleagues about what I should be doing at home with my 5 year old. But I do have a pretty thorough background in assessment and statistics, so that’s the lens I’m going to review this with. If there are early-literacy-based reasons why all of this is wrong, though, please let me know. I’m happy to be corrected!

So let’s go through a few reasons why I’m questioning this notion of students gaining more from fall to midyear than from midyear to spring.

First, the authors’ evidence for this claim is based on the percentages of students moving from the “high risk” band to the “moderate risk” band. Any claim about student growth that is derived from students moving between ordinal categories is suspect. In short, each VALLSS risk band comprises a whole range of scores. For example, any kindergartener who scores between 433-527 on the fall assessment is classified in the “high risk” band, whereas the “high risk” band for kindergarteners on the mid-year assessment is 448-566. So Student A could score 527 on the fall assessment and 567 on the mid-year assessment – growing by 40 points from fall to mid-year – and move from “high risk” to “moderate risk,” whereas Student B could score 450 in the fall and 565 on the mid-year assessment – growing by 115 points – and not move out of the “high risk” band.

I'll use an analogy to make the same point. Imagine that running a sub-20-minute 5k classifies you as “fast” and running anything slower than 20 minutes classifies you as “not fast,” at least in this arbitrary hypothetical. It would be unreasonable to say that someone who improved from a 20:05 5k to a 19:55 5k – and thus crossed the 20-minute threshold – made “more progress” than someone who improved from 25:00 to 20:30. Using movement between ordered categories as a measure of "progress" leaves a lot to be desired, particularly when we have a continuous variable better suited to measuring progress that we're opting not to use!
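To make this concrete, here’s a toy sketch in Python using the kindergarten “high risk” cutoffs cited above (fall: 527 and below; mid-year: 566 and below). The student names and the two-band simplification are mine, purely for illustration:

```python
# Toy illustration: movement between risk bands vs. actual score gains.
# Cutoffs are the kindergarten "high risk" upper bounds from the VALLSS
# thresholds quoted above; everything else is a simplified hypothetical.
FALL_HIGH_RISK_MAX = 527
MID_HIGH_RISK_MAX = 566

def band(score, high_risk_max):
    """Collapse a scaled score into a two-band classification."""
    return "high" if score <= high_risk_max else "moderate-or-better"

students = {
    "Student A": (527, 567),  # gains 40 points, crosses the cutoff
    "Student B": (450, 565),  # gains 115 points, stays "high risk"
}

for name, (fall, mid) in students.items():
    gain = mid - fall
    moved = band(fall, FALL_HIGH_RISK_MAX) != band(mid, MID_HIGH_RISK_MAX)
    print(f"{name}: gained {gain} points, moved bands: {moved}")
```

The student who gains nearly three times as many points shows no “progress” at all by the band-movement measure — which is exactly why counting band transitions is a poor proxy for growth when the underlying continuous scores are sitting right there.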

Second, it seems very reasonable to me to assume that students’ fall scores will be slightly depressed due to the “summer slide” – the phenomenon where students “lose” some of the previous academic year’s gains over the summer. Assuming that students recover these losses more quickly than they learn new material, we would absolutely expect them to “gain” more from fall to midyear than from midyear to spring, since they'll be doing most of this recovery in the fall.

Third, if you look at the actual scaled score thresholds for all of the risk bands (high, moderate, and low risk) from the fall, mid-year, and spring VALLSS reports, we can see that the gaps between the lower bounds of each band are asymmetrical. Let’s consider the lowest possible “moderate risk” score for kindergarten students. In each test window, these are:

  • Fall = 528
  • Mid = 567
  • Spring = 611

If we do a little bit of arithmetic, we can see that the difference between midyear and fall is 39 points, and the difference between spring and mid-year is 44 points. Again, I’ll admit that my understanding of early literacy development is…incomplete, so it’s very possible that this sort of nonlinear growth is expected and very much the norm and should be baked into the thresholds. But it’s worth noting that students do, quite literally, have to gain more points from mid-year to spring than from fall to mid-year to stay at the very bottom of the moderate risk band.
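That arithmetic, spelled out as a quick sketch (thresholds are the kindergarten “moderate risk” lower bounds listed above):

```python
# Lowest "moderate risk" scaled scores for kindergarten, per the
# VALLSS threshold tables cited above.
thresholds = {"fall": 528, "mid": 567, "spring": 611}

fall_to_mid = thresholds["mid"] - thresholds["fall"]      # 39 points
mid_to_spring = thresholds["spring"] - thresholds["mid"]  # 44 points

# A student sitting at the very bottom of "moderate risk" in every
# window must gain more points in the second half of the year than
# in the first just to hold their position.
print(fall_to_mid, mid_to_spring)
```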

Finally, what would this even mean, practically speaking? For students to be gaining less, literacy-wise, from midyear to spring than from fall to midyear? Like, what would the causal explanation be here? That teachers are doing a worse job teaching in the spring than in the fall? That students are less motivated in the spring? The authors don't actually venture a reason why this might be – which is to their credit. But absent a compelling causal explanation, it seems more likely to me that these results are a statistical artifact rather than some true effect.

I will also note that I don’t think there’s anything malicious about this claim that students are “growing more from fall to mid-year than from mid-year to spring.” I doubt there’s much to the claim, but I don’t think the authors have some agenda they’re trying to push by making it.

If you’re enjoying these posts, please consider subscribing to the newsletter by entering your email in the box below. It’s free, and you’ll get new posts in your email whenever they’re published.