Published in the Oct. 28 – Nov. 10, 2015 issue of Morgan Hill Life

Last spring, Morgan Hill Unified School District students in grades three to eight and grade 11 joined 3.2 million students throughout the state in taking the new computer-based math and English language arts/literacy tests under the California Assessment of Student Performance and Progress (CAASPP) program. The exam helps determine how well Common Core is doing as an educational initiative that details what K–12 students should know in English language arts and mathematics at the end of each grade.

California State Superintendent of Public Instruction Tom Torlakson released the results Sept. 9 for the Smarter Balanced Assessment Consortium test, which comprises the largest component of the CAASPP. Among all MHUSD students, 50 percent met or exceeded the SBAC standards for English language arts and literacy and 40 percent did so for math. In comparison, statewide 44 percent of California students met the English standard and 33 percent met the math standard.

Some people in Morgan Hill, including some school board trustees, have decried the local SBAC scores as an indication that something is not working in how MHUSD classrooms teach the Common Core curriculum.

The challenge for many is understanding that SBAC is a “snapshot” of a student’s individual performance, and it cannot be used to rank schools in the same way that parents and real estate agents used the old Academic Performance Index (API).

Comparing the two is unfair, the equivalent of comparing clocks and yardsticks as measuring devices. Common Core is about learning how to learn rather than what you learn. It emphasizes skills over content.

Another consideration is that this was the first year SBAC was used, so the baseline was intentionally set low to provide a calibration point (a starting measure for a growth model), leaving room for scores to move upward or downward as students are tested in future years. If the baseline is set too high, scores risk hitting a ceiling in later years, making the testing essentially worthless as a measure of growth.

That said, there is valid concern about SBAC assessment in California. San Jose State University Professor Roxana Marachi notes that a significant challenge with SBAC is the apparent inability of the California Department of Education to produce evidence that the test results are reliable and valid.

Doug McRae, a retired testing specialist from Monterey, told the California State Board of Education last month that the question for Smarter Balanced test results is not their relationship to old STAR data on the CDE website, but rather the quality of the scores being provided to local districts and schools. “These scores should be valid, reliable, and fair, as required by California statute as well as professional standards for large scale K-12 assessments,” he told the board.

When McRae filed a public records request with the CDE for documentation of the validity, reliability and fairness of the Smarter Balanced tests, whether held in CDE files or obtainable from the Smarter Balanced consortium, the CDE replied in a letter that no such information was in its files.

The challenge for MHUSD is how to educate parents and the public as a whole on what SBAC scores truly mean. The district has to work against a public perception of assessment shaped by 15 years of API and CST scores and school rankings. It’s only human to want to take these new scores and use them to rank schools, as was done in previous years with the other tests. That isn’t what SBAC was meant to achieve; it measures the individual performance of students, not schools.

Another concern we have is that the SBAC is a “computer adaptive test,” which means it does not give all children the same questions. Adaptive tests can prejudge a student’s capability because they assign new questions based on the student’s performance on previous questions.

“The design tracks students into a presumed potential or lack of potential,” MHUSD Superintendent Steve Betando told us. “The student is then scored on both the number of correct answers and the difficulty of the questions completed. The test is designed so that most students will answer about half the questions correctly and half incorrectly. However, this adaptivity can never provide an accurate rating for true understanding and potential.”

SBAC’s design also does not provide data for each student that would account for social influences that might skew the adaptive tracking.

“The test design structurally oppresses students who may not have rich language experiences in early questions by preventing them from having an opportunity to demonstrate high performance in later questions,” Betando said.
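
To make the mechanism concrete, here is a minimal sketch of how a computer-adaptive test might select questions and score a student. The item pool, the one-step difficulty ladder and the difficulty-weighted scoring rule are hypothetical simplifications for illustration only; the actual SBAC uses far more sophisticated item-response models.

```python
import random

# Hypothetical item pool: 20 questions at each difficulty level from 1 (easy)
# to 10 (hard). An illustrative toy, not the actual SBAC item bank.
ITEM_POOL = {d: [f"question_{d}_{i}" for i in range(20)] for d in range(1, 11)}

def run_adaptive_test(student_answers, num_questions=20, start_difficulty=5):
    """Administer a toy adaptive test.

    student_answers(question, difficulty) -> bool simulates the test taker.
    The score weights each correct answer by its difficulty, mirroring the
    idea that students are scored on both the number of correct answers
    and the difficulty of the questions completed.
    """
    difficulty = start_difficulty
    score = 0
    for _ in range(num_questions):
        question = random.choice(ITEM_POOL[difficulty])
        if student_answers(question, difficulty):
            score += difficulty                   # harder items are worth more
            difficulty = min(10, difficulty + 1)  # step up after a correct answer
        else:
            difficulty = max(1, difficulty - 1)   # step down after a miss
    return score

# A simulated student who reliably answers items at difficulty 6 or below.
steady_student = lambda question, difficulty: difficulty <= 6
print(run_adaptive_test(steady_student))
```

Because the difficulty ladder only steps up after correct answers, a student who misses early questions is routed toward easier items and gets fewer chances to attempt, and earn credit for, the hardest ones. That routing is the structural concern Betando describes.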

According to the Smarter Balanced Assessment Consortium’s own data, more than 90 percent of the English language learners taking the SBAC will be labeled as failures. This can lead to discrimination and racism that can divide a school and a community. It might even bring about segregation if it encourages the grouping and regrouping of students into certain demographic groups that do well no matter where they are because of socioeconomic advantages.

Kids with language barriers and limited parental support will have a harder time in the testing process than kids with English-language and home advantages. But that doesn’t mean the disadvantaged students are less smart.

The challenge is to help those students overcome the barriers and gain the experiences, knowledge, skills and interaction they need as quickly as possible so they can move forward.

We encourage school board members and parents to take SBAC test results for what they were intended to be: a tool for helping schools shape their Common Core instruction. The results definitely should not be used as an indicator of how students, teachers and schools are doing, and especially not as a marketing tool to promote real estate sales.