STAR Results and Proficiency Scores

National Academic and Character Team Databits

My state proficiency scores don’t agree with my STAR results. What’s going on?

Proficiency is a statistically vague term that is defined in vastly different ways by different people. The Merriam-Webster dictionary defines proficient as “well advanced in an art, occupation, or branch of knowledge”. This is fine, but there will be a lot of variation in what observers think is “well advanced”. This vagueness is removed on the state assessments by picking a rather arbitrary cut score on some measure of the skill or knowledge involved. For example, we can define a proficient runner as one who can run 100 meters in under 12 seconds. This results in the strange situation where an athlete who runs the 100 meters in 12.01 seconds is not proficient and lands in the same category as an “athlete” who needs 26.45 seconds, while an athlete who completes the same race in 11.59 seconds is now in the same category as Usain Bolt. This reduction of a continuous variable that contains a lot of information into a rough yes/no dichotomy seems absurd, but it is exactly what happens with the state tests and the “proportion proficient” scale used to judge schools.
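To make this concrete, here is a minimal sketch in Python of how a single cut score collapses a continuous measure into a yes/no label. The runners and their times are made up for illustration; only Usain Bolt’s 9.58-second world record is real.

```python
# Hypothetical 100-meter times in seconds (Bolt's 9.58 s world record is real).
times = {"Runner A": 11.59, "Runner B": 12.01, "Runner C": 26.45, "Usain Bolt": 9.58}

CUT_SCORE = 12.0  # the arbitrary definition of a "proficient" runner

for runner, seconds in times.items():
    label = "proficient" if seconds < CUT_SCORE else "not proficient"
    print(f"{runner}: {seconds:.2f} s -> {label}")

# Runner B (12.01 s) lands in the same bin as Runner C (26.45 s), while
# Runner A (11.59 s) shares a bin with Usain Bolt: the continuous information
# about how fast each runner actually is has been thrown away.
```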

When we have artificial cut scores on two continuous measures (STAR and the state test), we have to be careful not to trip up on different definitions of proficiency. For example, Imagine has gone from using a commonly accepted definition of “on grade level or above” (PR >= 40) as the threshold for benchmark or advanced to one that counts a student at or above the middle of the national distribution (PR >= 50) as benchmark or advanced. These are well-accepted statistical definitions. However, these categories were not intended to mean proficient or to equate to any state’s definition of proficient. A student at the median (PR = 50) may or may not be proficient, depending on how you interpret that term.

If we assume, for the sake of discussion, that the distribution of student academic abilities is the same in all states (not true, of course), we can compare approximate PR definitions of “proficient” for each state. For each subject, the proportion meeting standard is the passing rate, and the PR equivalent expresses the corresponding proficiency cut in percentile rank terms (e.g., for Colorado, a student would have to score equal to or better than 60% of the students who took the state ELA assessment in order to be considered proficient).

Proportion Passing (meeting standards, proficient, etc.) Overall by State

            ELA – Reading                                 Mathematics
State       Proportion meeting standard   PR Equivalent   Proportion meeting standard   PR Equivalent
Colorado    0.40                          60              0.30                          70
Florida     0.52                          48              0.57                          43
Maryland    0.39                          61              0.33                          67
Ohio        0.55                          45              0.60                          40
California  0.49                          51              0.37                          63
Arizona     0.38                          62              0.38                          62
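Under the assumption above (an identical national distribution of abilities in every state), the PR equivalent is simply 100 times one minus the passing rate. A short sketch, using the ELA figures from the table (illustrative only):

```python
# ELA passing rates (proportion meeting standard) from the table above.
ela_passing = {
    "Colorado": 0.40, "Florida": 0.52, "Maryland": 0.39,
    "Ohio": 0.55, "California": 0.49, "Arizona": 0.38,
}

for state, p in ela_passing.items():
    # If a fraction p of students meets the standard, the proficiency cut sits
    # at roughly the (1 - p) point of the distribution, i.e. a student must
    # outscore about 100 * (1 - p) percent of test takers to be "proficient".
    pr_equivalent = round(100 * (1 - p))
    print(f"{state}: passing rate {p:.2f} -> PR equivalent ~{pr_equivalent}")
```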

To resolve the issue of such different definitions of “proficient” across the states, Renaissance has conducted a linkage study for each state, resulting in the optimal scale score on STAR, by subject and grade level, for differentiating between “predicted proficient” and “predicted not proficient” on that state’s test. These optimal cut scores are available in the state linkage study reports, links to which are contained in the first blog post, “How Optimizing Performance on STAR Relates to State Assessments”, dated September 13, 2016. They, and the other information contained in the linkage studies, form the basis for the State Performance – Class and State Performance – Student Reports and the State Standards – Class and State Standards – Student Reports.
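The linkage study reports describe Renaissance’s actual methodology; purely as an illustration of the general idea, the sketch below (hypothetical data and a simple agreement criterion, not Renaissance’s procedure) picks the STAR scale score whose pass/fail split best matches observed proficiency on a state test:

```python
def best_cut_score(star_scores, proficient_flags):
    """Return the STAR scale score whose predicted-proficient split best
    agrees with observed state-test proficiency (simple accuracy criterion)."""
    best_cut, best_agreement = None, -1.0
    for cut in sorted(set(star_scores)):
        predictions = [score >= cut for score in star_scores]
        agreement = sum(p == a for p, a in zip(predictions, proficient_flags)) / len(star_scores)
        if agreement > best_agreement:
            best_cut, best_agreement = cut, agreement
    return best_cut, best_agreement

# Hypothetical matched records: each student's STAR scale score and whether
# they were proficient on the state test.
star = [420, 455, 470, 490, 510, 530, 560, 600]
proficient = [False, False, False, True, False, True, True, True]
print(best_cut_score(star, proficient))  # (490, 0.875) on this toy data
```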

Great care must be taken when attempting to predict an individual student’s performance on a state test. Many idiosyncratic issues or occurrences can influence a student’s score on a single test, and these sources of error combine when one assessment is used to predict another. When we reduce the information in both assessments to a dichotomy (predicted to pass or not on one, passed or not on the other), these potential errors are magnified.
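As a rough illustration of this point (a toy simulation with made-up numbers, not real STAR or state-test data), consider a student whose true ability sits just above both cut scores; random measurement error on each single test means the prediction and the outcome will frequently disagree:

```python
import random

random.seed(1)

TRUE_ABILITY = 0.52   # hypothetical student just above both cuts
CUT = 0.50            # the same (arbitrary) cut used for both tests, for simplicity
ERROR_SD = 0.05       # measurement error on any single test administration

trials, disagreements = 10_000, 0
for _ in range(trials):
    star_observed = random.gauss(TRUE_ABILITY, ERROR_SD)
    state_observed = random.gauss(TRUE_ABILITY, ERROR_SD)
    if (star_observed >= CUT) != (state_observed >= CUT):
        disagreements += 1

# For this near-the-cut student, roughly 45% of trials disagree, even though
# both "tests" measure exactly the same underlying ability.
print(disagreements / trials)
```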

I am convinced that working with ALL students, using ALL of the information available (including STAR, focus assessments, checks for understanding, personal knowledge, etc.), and moving them as far as possible is a winning strategy for optimizing the results on any state test.

Comments (2)

  1. This is a great, technical post, and it’s interesting to note the contrast between the proportion meeting the standard and the PR equivalent for each state. It certainly details the gap we see between STAR and the state tests.

    What recommendation does STAR or our National Academic team have for better aligning with the increasing focus on writing on our state tests? For example, in Ohio, we have the online AIR assessment, which has a major focus on writing within the ELA portion. However, STAR currently doesn’t assess writing. Has feedback been offered to Renaissance to include more writing-based tasks, or at least a greater linkage between STAR and the state test?

  2. Thank you; I am glad that you enjoyed it. I appreciate your continuing the conversation by asking questions.

    We posed this question to Renaissance and their response is:

    “Star will be linked to Ohio AIR as soon as Ohio students take the assessment and we have enough data to complete the linking. I will share your feedback regarding the inclusion of writing-based tasks in Star or other Renaissance products. Thank you for your suggestion.”
    Have a wonderful day!

    The Florida state test was developed by AIR and the relationship between STAR and the Florida Standards Assessment is very high (see my first blog post). I am sure that this strong relationship will also be the case with the Ohio test.
    Writing has a significant skill (as well as knowledge) component and is improved by practice. I am convinced that one of the best ways to fully learn something is to explain it to someone else through a cogent, understandable written explanation. Writing can and should be used in every subject and classroom (including mathematics), not only to improve a student’s writing skills but also to help them more fully understand the subject matter they are writing about.
