If the New Jersey Department of Education (NJDOE) and the New Jersey State Board of Education (NJSBOE) are not listening to the public, who are they listening to? What is their reaction to all of our testimony? NJDOE provided responses to testimony when they released the August 3rd agenda, and this is what stood out for me.
In some cases they simply disagreed and said so. In other cases, they had some interesting citations to back up their claims about validity. And, for the special education-related comments, they clarified just who controls the graduation requirements for students with IEPs.
One comment in particular stuck out (besides the ones that were aimed at me) because the testimony belonged to Dr. Eric Milou, a Rowan University professor, recipient of the Max Sobel Outstanding Mathematics Educator Award, and former president of the Association of Mathematics Teachers of New Jersey (AMTNJ) and the National Council of Teachers of Mathematics. This is exactly the kind of education professional this board should have been listening to, but this is their response:
40. COMMENT: The commenter stated there is no evidence the PARCC assessment is an improvement over previous standardized tests, raises student performance, provides useful diagnostic information, or indicates career or college readiness. The commenter also stated only rigorous curriculum, instruction, and the use of formative assessments will have a significant impact on student educational success. (99)
RESPONSE: Several studies (e.g., National Network of State Teachers of the Year, 2015; Massachusetts Executive Office of Education, 2015; Center for American Progress, 2016; Fordham/Human Resources Research Organization, 2016; American Institutes for Research, 2016) have supported PARCC as an accurate measure of college and career readiness and endorsed PARCC as an improvement over previous assessments.
Dr. Milou got right to the heart of what's wrong with standardized tests in general and what's wrong with PARCC specifically: it doesn't actually provide the information that NJDOE claims it does. As we pour millions of tax dollars into a highly flawed testing system, shouldn't it, at the very least, do what NJDOE claims? Shouldn't someone, somewhere, define what "college and career ready" actually means?
Also relevant is how you go about determining validity and whom you choose to document those claims. Isn't that what we're allegedly trying to help our kids navigate? Knowing who is behind the research supporting your arguments, so you can understand and account for undue influence? That's really important stuff, right?
Well, in this case, NJDOE is relying on information from sources that I consider questionable because of where their funding comes from. I'm not going to tip-toe around that, because when the same very deep pockets are quietly funding organizations that people trust, we all need to know where those organizations are coming from. I want data, information, and opinions from places where a particular, singular influence can be accounted for. In this case, NJDOE is clearly very happy with anything funded by the Gates Foundation, an entity with a very singular focus on the privatization of US public schools, on the Common Core State Standards, and on the associated testing, like PARCC. Nothing the Gates Foundation does or supports is friendly to PUBLIC education.
Let's look at who NJDOE and NJSBOE are listening to:
National Network of State Teachers of the Year (NNSTOY). From the Gates Foundation website: in 2015, NNSTOY was awarded a $1,000,000 grant "to improve student learning across the nation by defining, sharing and advocating for effective teaching practices and policies."
NJDOE didn't bother to name any of the studies to which they refer, but I'll presume they are talking about "The Right Trajectory" study released earlier this year. Twenty-three Teachers of the Year took a look at PARCC, SBAC, NJASK, NECAP, DCAS, and ISAT at the 5th-grade level. They applied Webb's Depth of Knowledge (DOK), along with other tools for assessing the level of challenge in each of the tests. The problem is, given how the questions were asked, they don't appear to have actually applied what they found. It reads more like an opinion questionnaire - which would be fine if you weren't trotting it out as evidence of validity. The study does not demonstrate that PARCC is "an accurate measure of college and career readiness."
I was not familiar with this particular study, and it's interesting to see what these teachers thought of the construct of these tests and, possibly, their usefulness. That said, there is nothing in the study that speaks to the validity of using PARCC to assess college and career readiness as a high school exit exam. I would argue that the simple fact that they only looked at 5th grade, and specifically left out consideration of students with disabilities, means the scope of the study doesn't include anything that supports "college and career ready" at the high school level. The study's conclusion is that PARCC is more challenging than NJASK. Ok. I'm good with that. NJASK was never written as a "deep skills and knowledge" test, so I wouldn't expect them to find it was.
Center for American Progress (CAP) is a heavily Gates Foundation-funded entity. From the Gates Foundation website: from 2008 through June 2016, they were awarded $8,998,810 in grants for everything from "to support Common Core implementation" to "enhance degree completion for low-income young adults through the publishing of new policy papers, stakeholder engagement, and media outreach" to "continue researching, understanding and promoting better human capital policies to benefit all public school students and to tackle the implications of developing education reforms."
I have no idea which study NJDOE refers to in their response. CAP has many "reports" on their website, but nothing that either compares PARCC to anything or demonstrates value in a high school exit exam. If anyone knows or has the study, please send it to me.
Fordham/Human Resources Research Organization (Thomas B. Fordham Institute and HumRRO). This was an interesting way to cite two different studies that worked in parallel. The studies looked at PARCC, 2014 MCAS, ACT Aspire, and SBAC. From the HumRRO study summary: "A parallel study was conducted by the Thomas B. Fordham Institute (hereafter referred to as Fordham), which implemented the [National Center for the Improvement of Educational Assessment] Center's methodology for grades 5 and 8 summative mathematics and ELA/literacy assessments. Taken together, HumRRO and Fordham were first to implement the Center's evaluation methodology. HumRRO and Fordham conducted their studies separately; however, the two organizations communicated often about the evaluation methodology and collaborated on the steps to implement it."
HumRRO also acknowledges who made their study possible: "This important work was possible from funding by the High Quality Assessment Project (HQAP), which supports state-based advocacy, communications, and policy work to help ensure successful transitions to new assessments that measure K–12 college- and career readiness standards. HQAP’s work is funded by a coalition of national foundations, including the Bill & Melinda Gates Foundation, the Lumina Foundation, the Charles and Lynn Schusterman Family Foundation, the William and Flora Hewlett Foundation, and the Helmsley Trust."
I haven't poked into just how much money that is, but the Thomas B. Fordham Institute was awarded $5,214,650 between 2006 and 2015 "to support the activities of an emerging network of state level education advocacy organizations in support of a convening around strategic issues," "for general operating support," and "to track state progress towards implementation of standards and to understand how what students read changes in response to the standards."
Interesting to note that the Fordham study looked at grades 5 and 8. Arguably, that has nothing to do with the validity of a high school exit exam for either math or English.
And the HumRRO study looked at PARCC's Performance-Based Assessment (PBA) and End-of-Year (EOY) components. New Jersey doesn't use the PBA (it was given only the first year, after which it was dropped), and the test given starting this year was allegedly some combination of the PBA and EOY. So what exactly has NJDOE extracted from a study that doesn't examine PARCC in the form the state actually uses?
American Institutes for Research (AIR) is also Gates Foundation-funded, although they are primarily focused on post-secondary education. Since 2009, they have been awarded $9,296,140 in grants. Since NJDOE didn't bother to name which AIR study they were referring to, I'll guess that it's the National Benchmarks for State Achievement Standards 2016 study. The purpose was to look at the quality of the college- and career-ready achievement standards in the tests, using grades 4 and 8.
From their "key findings," the standard for PARCC ELA is equivalent to NAEP "Basic" and the PARCC math standard is equivalent to NAEP "Proficient."
Go to page 19 of the study and read the list of "caveats." My favorites?
"Second, in some states, some of the grade 8 mathematics students took an end-of-course test,
such as Algebra 1. In this benchmarking study, this factor could have had the effect of making
the state grade 8 mathematics standards appear higher."
"This should not be interpreted to mean that NAEP’s Proficient levels in grades 4 and 8 are the gold standards for deciding whether our students are on track to be ready for college. No evidence has been presented by NAEP that
the proficient standard in grades 4 and 8 predicts college success."
"Fifth, this report does not, in any way, address or evaluate the quality of the CCSS. The CCSS are content standards, while this report deals only with achievement standards. Content standards represent the curriculum that teachers should teach, and the scope and sequence of what students should learn in school. Achievement standards are cut-scores on the state test that represent performance expectations." Here's what Drs. Tienken, Sforza, and Kim found on the "quality" of CCSS.
Again, grades 4 and 8 were used, not any of the high school grades. There is nothing here to support the validity of a "college and career ready" determination at the high school level, or as an exit exam.
Massachusetts Executive Office of Education (MEOE). They are presumably referring to the Mathematica study done last year comparing MCAS and PARCC for MEOE. Why they didn't just say that, I have no idea. Then again, at this point I have no idea why NJDOE does anything. Anyway, I saved this one for last because I've written about it and provided testimony that runs contrary to how NJDOE has framed this study in their support of PARCC. You can read my whole piece here, but I will just share these two particular points in this post:
1. From “key findings” on page ix of the report: “Both the MCAS and PARCC predict college readiness. Scores on the assessments explain about 5 to 18 percent of the variation in first-year college grades…” What does this mean exactly? It means that 82 to 95 percent of that variation CANNOT be explained by the results of the PARCC test (the quick arithmetic sketch after these two points spells this out).
2. Dr. William Mathis, managing director of the National Education Policy Center at the University of Colorado, former Deputy Assistant Commissioner for the state of New Jersey, Director of its Educational Assessment program, a design consultant for the National Assessment of Educational Progress (NAEP) and for six states, had this to say about the Mathematica report in a Washington Post article on 27 May 2016, “A tour through the literature shows that predictive validity coefficients are quite low in general and commonly run in the 0.30’s. One conclusion is that the PARCC is just about as good as any other test — which is the report’s finding in regard to the MCAS. On the contrary, the more correct conclusion is that standardized tests can predict scores on other standardized tests (which this report confirms) but it cannot validly predict college readiness at any meaningful level.”
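To make the arithmetic behind these two points concrete, here is a minimal sketch of my own (not anything taken from the Mathematica report or the Post piece). It simply converts the figures quoted above - the 5 to 18 percent explained variation from the report's key findings, and a predictive validity coefficient of roughly 0.30 of the kind Dr. Mathis describes - into explained versus unexplained variation.

```python
# Quick arithmetic behind points 1 and 2 above (my own sketch, not from the report).
# R-squared is the proportion of variation in first-year college grades that
# test scores explain; everything else is left unexplained.

# Point 1: the report's key finding of 5 to 18 percent explained variation.
for r_squared in (0.05, 0.18):
    unexplained = 1 - r_squared
    print(f"R^2 = {r_squared:.2f} -> {unexplained:.0%} of the variation in college grades is unexplained")

# Point 2: a predictive validity coefficient "in the 0.30's" is a correlation (r),
# and squaring it gives the share of variation explained.
r = 0.30
print(f"r = {r:.2f} -> r^2 = {r*r:.2f}, i.e. about {r*r:.0%} explained and {1 - r*r:.0%} not")
```

However you slice it, the large majority of the variation in how students actually perform in their first year of college is not captured by the test score.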
You could probably write a book about how much these studies do NOT support using PARCC as a college- and career-readiness high school exit exam. I think NJDOE and NJSBOE need a lesson in how to read studies like these and how to properly draw conclusions from them.
I will say, again, that having public ed policy so constrained by standardization is nothing but lazy. It does not serve our children. It does not serve our society. I am furious that we all have to wait and hope for a Governor who will have much higher expectations of public education in New Jersey, and who understands that test scores are incredibly limited in their usefulness. Our kids deserve nothing less.