Throughout his tenure, Gov. Glenn Youngkin’s administration has focused on the need to “raise expectations” in Virginia public education. As proof, officials have pointed to drops in proficiency ratings and test scores on both state assessments and the National Assessment of Educational Progress.
The culprits, the Youngkin administration says, are decisions by the previous Virginia Board of Education to lower the cut scores on student assessments and to change the state’s standards of school accreditation.
However, the few board members who have remained since Youngkin took office said there are other factors to consider in determining why student test scores have dropped, including socioeconomic issues and the rigor of the tests being administered.
The debate can be hard for non-policy experts to follow. Here’s what to know.
The Standards of Learning and NAEP
Before Virginia education officials determine what students need to learn at a particular grade level, they craft the Standards of Learning — a series of educational objectives in mathematics, reading, writing and history and social science that students are expected to meet and demonstrate proficiency in through state tests.
Virginia first administered SOL tests in 1998. Student scores are classified as failing/basic, proficient or advanced.
Virginia defines “proficient” as “evidence that the student demonstrated the skills and knowledge defined in the Standards of Learning as appropriate for the grade level or course.”
Besides taking SOL tests, representative groups of fourth and eighth grade students in Virginia and all other states also take the National Assessment of Educational Progress, or NAEP. This assessment of mathematics, reading, science and writing is mandated by Congress and has its own system of classifying results as basic, proficient and advanced.
Cut scores are the test scores used by state education agencies and boards of education to classify whether students are proficient or advanced in a given subject. For example, a “proficiency” cut score might be 26 right answers in a set of 50 questions. An “advanced” cut score might be 45 right answers out of 50.
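In code, that classification step amounts to a simple threshold comparison. The sketch below uses the article’s illustrative numbers (26 and 45 correct out of 50); these are hypothetical figures, not Virginia’s actual SOL cut scores.

```python
def classify(correct: int, proficient_cut: int = 26, advanced_cut: int = 45) -> str:
    """Map a raw score to a performance label using hypothetical cut scores.

    The 26/50 and 45/50 cuts are illustrative only; real cut scores are
    set per test by the Board of Education.
    """
    if correct >= advanced_cut:
        return "advanced"
    if correct >= proficient_cut:
        return "proficient"
    return "failing/basic"

print(classify(48))  # advanced
print(classify(30))  # proficient
print(classify(20))  # failing/basic
```

Lowering a cut score by even a few points moves every student in that narrow band from one label to the next, which is why small changes can visibly shift statewide proficiency rates.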
Across the country, state boards use two methods to determine their cut scores: the Angoff method or the bookmark method.
Virginia uses the Angoff method, which sets cut scores before tests are administered, based on the content of the assessment. The bookmark method sets cut scores after tests are administered, based on test data. Both methods rely on education experts to judge how rigorous tests are and where the cut scores should be set.
In Virginia, three different groups — a standard setting committee, an articulation committee and the superintendent — put forward recommendations of what the cut scores should be for each subject and each grade. The Board of Education then votes to adopt the final cut scores.
Student test results
Since August, test scores from Virginia’s public schools have been under a microscope by educators and critics.
That month, SOL results revealed that while students performed better in 2021-22 after returning to in-person learning compared to 2020-21, pass rates were below the state’s baseline.
Two months later, NAEP results showed declines in Virginia in both reading and math between 2019 and 2022, and continuous drops in fourth graders’ proficiency since 2017.
Recent Virginia cut score and accreditation decisions
The current debate concerns the board’s decision in 2017 to change accreditation requirements, which went into effect during the 2018-19 school year. In 2019, the board also lowered cut scores in Grades 3-8 mathematics, and in 2020, it lowered them in reading for the same grade levels.
Overall, the board lowered the proficiency cut scores by one to four points in every grade level in reading and mathematics. For example, Grades 6 through 8 saw a three-point drop in the reading proficiency scores, and Grades 4 through 8 saw four-point drops in the mathematics proficiency scores.
Board President Daniel Gecker and Vice President Tammy Mann, appointees of Democratic Govs. Terry McAuliffe and Ralph Northam, told the Mercury that the board adopted reading cut scores based on recommendations provided by the superintendent and articulation committee, which is made up of subject experts.
Gecker said in an email that the board saw the purpose of the cut scores on the SOLs as providing an assessment of whether a student has sufficient knowledge to advance to the next grade.
“The scores were set at that level as recommended by the articulation committee,” Gecker wrote. “At that time we understood the potential for political fallout (the issue of having too many children pass the tests is not a new one).”
Prior to the cut score changes, the Board of Education also revised its school accreditation guidelines, the criteria set by the state for whether a school meets certain expected educational standards. The new accreditation standards included measures beyond student performance on tests, including chronic absenteeism and other factors.
The administration’s perspective
The Youngkin administration said the 2019 and 2020 decisions to lower the cut scores, coupled with the 2017 accreditation standards changes, lowered state expectations for students and drove down student achievement, even before the commonwealth shut down in-person classes due to the COVID-19 pandemic.
“The State Board of Education changed its accreditation requirements in 2017 to de-emphasize grade-level proficiency in reading and math,” former Superintendent of Public Instruction Jillian Balow wrote in a scathing 34-page report issued in May 2022. “Despite the gaps between state and national proficiency standards, the State Board of Education voted to lower the proficiency cut scores … on Standards of Learning (SOL) tests in math and reading in 2019 and 2020, respectively.”
During a Jan. 19 Senate Education Committee hearing, Secretary of Education Aimee Guidera argued higher cut scores have historically led to higher student achievement in Virginia.
In 1998, for example, she said, Virginia fourth graders achieved higher proficiency ratings in reading on the NAEP compared to the national average. Then, in 2003 and 2013, Virginia fourth graders pulled further ahead of the national average in reading, an outcome Guidera said was linked to the Board of Education’s decision to raise its Standards of Accreditation.
“When we continue to benchmark our standards to what the world requires and our students deserve, students perform and it works,” Guidera said.
In October, following the release of the NAEP results, Youngkin said the accreditation system “masked the fact that we are failing too many of our students across the commonwealth.”
At the same briefing, Balow accused the previous two administrations of systematically lowering standards for schools and expectations for students. She added that policymakers made the “conscious decision to end the practice of driving expectations upward.”
Before the release of the NAEP results, Balow’s May 2022 report argued that earlier increases in “the rigor” of accreditation standards, including the addition of high school graduation benchmarks, higher expectations for elementary reading and the replacement of multiple-choice tests with assessments based on content knowledge, led to more students performing well on the SOLs and NAEP.
“This culture of excellence took a wrong turn in 2015 as the State Board of Education began a review of its accreditation regulations, culminating in a 2017 adoption of accreditation standards that watered down the importance of grade-level proficiency,” she wrote. The new standards, she continued, were followed by “a steady decline in student achievement on state SOL tests” as well as on NAEP.
The accreditation changes, Guidera said, “erased 20 years of success in the past five years.”
Mark Schneider, director of the Institute of Education Sciences within the U.S. Department of Education, said during an Oct. 19 meeting that while Virginia students “are, on average, doing pretty good,” the state’s SOL standards and cut scores are too low.
“It’s not like the performance of your students is bad — it’s not as good as you want — it’s just that the way in which you’re setting your cut score for proficient is so far below where it should be,” he said.
He continued that “when you tell me that 68% of your students are proficient, and I’m using a national benchmark, and it says 48, that’s a big disconnect.”
Schneider’s remarks are in line with the administration’s claims that Virginia has an unusually wide “honesty gap,” a term used to describe the difference between state-level and NAEP proficiency standards.
Gecker said it’s difficult to know whether the revisions to the Standards of Accreditation were effective because they were implemented shortly before the pandemic began.
“The standards were in effect only a year before COVID, then they were waived,” Gecker wrote. “It is difficult to say what the impact would be although we know under the previous higher expectation standards the results were not improving.”
Last year, he wrote in response to the superintendent’s 34-page report that research has shown a link between student socioeconomic status and academic achievement, and the number of economically disadvantaged students has increased over more than a decade in Virginia.
According to a 2019 study conducted by the state’s Joint Legislative Audit and Review Commission, Virginia ranked 26th in the nation in state and local per pupil education funding and 42nd in state per pupil funding. It ranked 33rd in average salary for K-12 public teachers.
“We cannot expect to change outcomes — or maintain previous levels of achievement — while starving the system of resources,” Gecker wrote. “And we cannot expect to attract and retain a high-quality cadre of teachers if we continually underpay the profession relative to other college graduates.”
Gecker said it’s “disingenuous” to believe that Virginia students’ lower NAEP results in 2022 were due to a reduction in expectations for both school accreditation and student assessments. He said the claim ignores declines in NAEP scores that were noted by the board and department prior to 2019 and likely resulted from the state continuing to fund public education at a level below that set in 2009, adjusted for inflation.
He also pointed to a study conducted by the Urban Institute last year that found that setting higher expectations on state assessments alone does not impact NAEP scores. The same study found no correlation between the state reading and math standards and NAEP proficiency ratings.
Previous board members have argued cut scores can’t be viewed in isolation but are set depending on the specific assessment and its level of difficulty. More challenging tests may have lower cut scores, while easier tests may have higher scores. Final decisions, say board members, are the result of evaluating recommendations from superintendents and the two committees.
Cut scores are adopted to “reflect the rigor of each test,” said Mann.
Recent Virginia Department of Education documents make similar observations: “It is important to be aware that cut scores must be interpreted in light of the difficulty of the test,” a March 23 agency report reads.
“It’s not like the board sits there and throws a dart at the wall and says, ‘What should the score be?’” Gecker told the Mercury. “It goes through a scientifically validated process and comes to us with recommendations.”
“By and large, the board has followed the modified Angoff method for many years,” he said. “It was not a question of lowering standards.”
What’s happened in the last six months?
Last October, Youngkin asked the Board of Education to raise expectations by establishing new accountability and accreditation systems and by raising cut scores for reading and math SOLs in Grades 3 through 8, among other proposals intended to address the declining proficiency scores.
“Work to establish new accountability and accreditation systems will include contemplation by the Board about learning proficiency and high expectations for students,” a memo to the board noted. “Raising our cut scores to what we believe is the content and skill mastery needed to be on track for readiness for college and career is foundational.”
On Nov. 17, the board discussed the matter after more than nine hours on the dais but did not act. Since then, the administration has hired a new superintendent, and department staff released a summary of the board’s adopted cut scores at the March 23 meeting.
As a way to improve the process of adopting cut scores, board member Andy Rotherham, a Youngkin appointee, recommended adding experts from higher education to the articulation committee. Board member Anne Holton, a McAuliffe appointee, recommended the board consider impact data, or the prediction of what pass rates will be, in determining future cut scores.