Tuesday, May 31, 2016

To: NJ State BOE: I Can't Believe We Still Have to Protest This Crap


Tomorrow, the New Jersey State Board of Education is taking its last public testimony on the use of PARCC as a graduation requirement. It took me a long time to decide what to write. What more can possibly be said that has not already been said by me and many, many others? A friend joked that he would simply state, "Please refer to my previous few testimonies," and then icily stare them down for the rest of the allotted speaking time. I was thinking along those lines, but this came out instead. 
1 June 2016
New Jersey State Board of Education 
Testimony on PARCC for graduation requirement: I can’t believe we still have to protest this crap.
The title of this testimony is not meant to be disrespectful to this Board, merely a demonstration of my frustration with having to continually appear before this Board and provide testimony that falls only on deaf ears. It has not mattered whether that testimony is heartfelt, stemming from personal experiences that none of you currently sitting on this Board would have any first-hand knowledge of, because (1) you don’t have children in public school, and/or (2) you aren’t the parent of a student with a disability. There is barely a hint of recognition on your part that you are missing a great deal by not listening to and engaging with the actual stakeholders in this mess, namely, parents and their children. 
It also hasn’t mattered when testimony has been presented with hard facts and figures. Unbelievably, there appears to be no curiosity at all about why there is so much pushback on Common Core and PARCC testing. This isn’t just some little hiccup. This is a monumental policy failure that will impact schools, teachers, and students, actual people, for a very long time. Your response? Do more of the same. It is remarkably lazy policy.
So here it is: PARCC is a failure. All of that time and money for a failure. Last October, Mathematica released the results of a study comparing the predictive validity of PARCC and MCAS, the Massachusetts state standardized test, for college and career readiness. This is highly relevant, since you are about to make PARCC the gatekeeper for a high school diploma in this state. The education policy you endorse has only that insipidly narrow focus on the still-undefined term “college and career ready,” as determined by a score on PARCC.
The Mathematica study looked at Grade 10 Math II, Algebra II, and ELA. (Let me remind you here that Drs. Tienken, Kim, and Sforza looked at that grade in a study and found the Common Core Standards to be well below the former NJ standards. See my testimony from 10 February 2016. Note: I wrote about it here.) The result, from the “key findings” on page ix of the report: “Both the MCAS and PARCC predict college readiness. Scores on the assessments explain about 5 to 18 percent of the variation in first-year college grades…” What does this mean exactly? It means that 82 to 95 percent of that variation CANNOT be explained by the results of the PARCC test. So how can that possibly, validly, predict “college and career readiness”??? Answer: It can’t. It doesn’t. 
Dr. William Mathis, managing director of the National Education Policy Center at the University of Colorado, a former Deputy Assistant Commissioner for the state of New Jersey and Director of its Educational Assessment program, and a design consultant for the National Assessment of Educational Progress (NAEP) and for six states, had this to say about the Mathematica report in a Washington Post article on 27 May 2016: “A tour through the literature shows that predictive validity coefficients are quite low in general and commonly run in the 0.30’s. One conclusion is that the PARCC is just about as good as any other test — which is the report’s finding in regard to the MCAS. On the contrary, the more correct conclusion is that standardized tests can predict scores on other standardized tests (which this report confirms) but it cannot validly predict college readiness at any meaningful level.” 
He also said, “With such low predictability, you have huge numbers of false positives and false negatives. When connected to consequences, these misses have a human price. This goes further than being a validity question. It misleads young adults, wastes resources and misjudges schools. It’s not just a technical issue, it is a moral question. Until proven to be valid for the intended purpose, using these tests in a high stakes context should not be done.”
The response to Dr. Mathis from the creators of the Mathematica study: “Mr. Mathis is also correct that the correlations are low enough that many students (and parents, and colleges) would overestimate or underestimate their true college readiness—if they relied only on the test score to make the judgment. Fortunately, students have lots of other information available to inform their judgments alongside the test scores (most importantly, their high school grades). We wouldn’t recommend that anyone rely exclusively on the test score for high-stakes decisions.”
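A back-of-the-envelope illustration of my own (not taken from the Mathematica report or from Dr. Mathis), for readers who want to see how the two sets of numbers fit together: the share of variance a test can explain is the square of its validity coefficient.

\[
r = 0.30 \;\Rightarrow\; r^{2} = (0.30)^{2} = 0.09, \qquad 1 - r^{2} = 0.91
\]

In other words, a validity coefficient in the 0.30s accounts for roughly 9 percent of the variation in first-year college grades, squarely inside Mathematica’s 5-to-18-percent range, and leaves roughly 91 percent unexplained.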
So, why would this Board consider, even for a minute, further wasting time and taxpayer money on PARCC? Let alone use such a faulty measure as an obstacle for the students in this state? It is your obligation to the public not to allow PARCC to be used as a graduation requirement. 



Thanks, Mike Simpson, for the picture.

Saturday, May 14, 2016

Top Secret: "The PARCC Test: Exposed" (edited and updated)


The original publish date of this piece was 14 May 2016. It was removed on 16 May 2016 because it contained three partial questions from the then-current PARCC test. A Columbia University professor had been sent PARCC questions by a horrified K-12 teacher and decided to share them in a blog post, which I republished below.

At the time, Blogger simply returned the published post to "Draft" status and I decided not to repost. My Tweets were also removed from Twitter. Twitter did inform me of the decision to remove the Tweets, but I never heard a word from Google or Blogger - just the removal of the post. Later, a friend sent me this link and this link to Lumen, which does offer some explanation. I still take issue with the decision because I believe the fair use doctrine of copyright law applies to the publishing of the partial test questions. 


Today (23 January 2017), I received the notice below via email (with no specific information about what needs to be removed from the post) and decided to republish, removing, per Blogger, the "offending content" (the PARCC questions). The email also gives no explanation of why I received it today. 

Blogger has been notified, according to the terms of the Digital Millennium Copyright Act (DMCA), that certain content in your blog is alleged to infringe upon the copyrights of others. As a result, we have reset the post(s) to "draft" status. (If we did not do so, we would be subject to a claim of copyright infringement, regardless of its merits. The URL(s) of the allegedly infringing post(s) may be found at the end of this message.) This means your post - and any images, links or other content - is not gone. You may edit the post to remove the offending content and republish, at which point the post in question will be visible to your readers again.
A bit of background: the DMCA is a United States copyright law that provides guidelines for online service provider liability in case of copyright infringement. If you believe you have the rights to post the content at issue here, you can file a counter-claim. In order to file a counter-claim, please see https://support.google.com/legal/contact/lr_counternotice?product=blogger.
The notice that we received, with any personally identifying information removed, will be posted online by a service called Lumen at https://www.lumendatabase.org. We do this in accordance with the Digital Millennium Copyright Act (DMCA). You can search for the DMCA notice associated with the removal of your content by going to the Lumen page, and entering in the URL of the blog post that was removed.
If it is brought to our attention that you have republished the post without removing the content/link in question, then we will delete your post and count it as a violation on your account. Repeated violations to our Terms of Service may result in further remedial action taken against your Blogger account including deleting your blog and/or terminating your account. DMCA notices concerning content on your blog may also result in action taken against any associated AdSense accounts. If you have legal questions about this notification, you should retain your own legal counsel.
Sincerely,
The Blogger Team
Affected URLs: http://elfasd.blogspot.com/2016/05/top-secret-parcc-test-exposed.html
The original post about what is so wrong with the PARCC questions, and why the Columbia professor was so astonished at the absurdity of the test questions, is still noteworthy and, I think, valuable. I have noted where text has been removed.  


Leonie Haimson is asking bloggers to post the original blog post, The PARCC Test: Exposed. The redacted version can be found here. PARCC has been removing posts on Twitter and auto-flagging posts of the blog on Facebook. 


As you read this, consider the absurdity of Pearson, PARCC, or any organization trying to keep secret the items on a test which is given to millions of students nationwide. Consider the awful quality of the test questions - what are we paying for? Then, think about all those written responses scored by a computer. Does this sound like something we should be paying millions of dollars for? 

The PARCC Test: Exposed
The author of this blog posting is a public school teacher who will remain anonymous.
I will not reveal my district or my role due to the intense legal ramifications for exercising my Constitutional First Amendment rights in a public forum. I was compelled to sign a security form that stated I would not be “Revealing or discussing passages or test items with anyone, including students and school staff, through verbal exchange, email, social media, or any other form of communication” as this would be considered a “Security Breach.” In response to this demand, I can only ask—whom are we protecting?

There are layers of not-so-subtle issues that need to be aired as a result of national and state testing policies that are dominating children’s lives in America. As any well-prepared educator knows, curriculum planning and teaching require knowing how you will assess your students and planning backwards from that knowledge. If teachers are unable to examine and discuss the summative assessment for their students, how can they plan their instruction? Yet, that very question assumes that this test is something worth planning for. The fact is that schools that try to plan their curriculum exclusively to prepare students for this test are ignoring the body of educational research that tells us how children learn, and how to create developmentally appropriate activities to engage students in the act of learning. This article will attempt to provide evidence for these claims as a snapshot of what is happening as a result of current policies.

The PARCC test is developmentally inappropriate
In order to discuss the claim that the PARCC test is “developmentally inappropriate,” let us examine three of the most recent PARCC 4th grade items.

A book leveling system, designed by Fountas and Pinnell, was made “more rigorous” in order to match the Common Core State Standards. These newly updated benchmarks state that 4th Graders should be reading at a Level S by the end of the year in order to be considered reading “on grade level.” [Celia’s note: I do not endorse leveling books or readers, nor do I think it appropriate that all 9 year olds should be reading a Level S book to be thought of as making good progress.]

The PARCC, which is supposedly a test of the Common Core State Standards, appears to have taken liberties with regard to grade level texts. For example, on the Spring 2016 PARCC for 4th Graders, students were expected to read an excerpt from Shark Life: True Stories about Sharks and the Sea by Peter Benchley and Karen Wojtyla. According to Scholastic, this text is at an interest level for Grades 9-12, and at a 7th Grade reading level. The Lexile measure is 1020L, which is most often found in texts that are written for middle school, and according to Scholastic’s own conversion chart would be equivalent to a 6th grade benchmark around W, X, or Y (using the same Fountas and Pinnell scale).

Even by the reform movement’s own standards, according to MetaMetrics’ reference material on Text Complexity Grade Bands and Lexile Bands, the newly CCSS-aligned “stretch” Lexile level of 1020 falls in the grades 6-8 band. This raises the question: what is the purpose of standardizing text complexity bands if testing companies do not have to adhere to them? And what is the purpose of a standardized test that exceeds the agreed-upon Lexile levels?

So, right out of the gate, 4th graders are being asked to read and respond to texts that are two grade levels above the recommended benchmark. After they struggle through difficult texts with advanced vocabulary and nuanced sentence structures, they then have to answer multiple choice questions that are, by design, intended to distract students with answers that appear to be correct except for some technicality.

Finally, students must synthesize two or three of these advanced texts and compose an original essay. The ELA portion of the PARCC takes three days, and each day includes a new essay prompt based on multiple texts. These are the prompts from the 2016 Spring PARCC exam for 4th Graders along with my analysis of why these prompts do not reflect the true intention of the Common Core State Standards.

ELA 4th Grade Prompt [Deleted per DMCA]

The above prompt probably attempts to assess the Common Core standard RL.4.5: “Explain major differences between poems, drama, and prose, and refer to the structural elements of poems (e.g., verse, rhythm, meter) and drama (e.g., casts of characters, settings, descriptions, dialogue, stage directions) when writing or speaking about a text.”

However, the Common Core State Standards for writing do not require students to write essays comparing the text structures of different genres. The Grade 4 CCSS for writing about reading demand that students write about characters, settings, and events in literature, or that they write about how authors support their points in informational texts. Nowhere in the standards are students asked to write comparative essays on the structures of writing. The reading standards ask students to “explain” structural elements, but not in writing. There is a huge developmental leap between explaining something and writing an analytical essay about it. [Celia’s note: The entire enterprise of analyzing text structures in elementary school – a 1940’s and 50’s college English approach called “New Criticism” — is ridiculous for 9 year olds anyway.]

The PARCC does not assess what it attempts to assess

ELA 4th Grade Prompt [Deleted per DMCA]

It would be a stretch to say that this question assesses CCSS W.4.9.B: “Explain how an author uses reasons and evidence to support particular points in a text.”

In fact, this prompt assesses a student’s ability to research a topic across sources and write a research-based essay that synthesizes facts from both articles. Even CCSS W.4.7, “Conduct research projects that build knowledge through investigation of different aspects of a topic,” does not demand that students compile information from different sources to create an essay. The closest the standards come to demanding this sort of work is in the reading standards; CCSS RI.4.9 says: “Integrate information from two texts on the same topic in order to write or speak about the subject knowledgeably.” Fine. One could argue that this PARCC prompt assesses CCSS RI.4.9.

However, the fact that the texts presented for students to “use” for the essay are at a middle school reading level automatically disqualifies this essay prompt from being able to assess what it attempts to assess. (It is like trying to assess children’s math computational skills by embedding them in a word problem with words that the child cannot read.)

ELA 4th Grade Prompt [Deleted per DMCA]
Nowhere, and I mean nowhere, in the Common Core State Standards is there a demand for students to read a narrative and then use the details from that text to write a new story based on a prompt. That is a new pseudo-genre called “Prose Constructed Response” by the PARCC creators, and it is 100% not aligned to the CCSS. Not to mention, why are 4th Graders being asked to write about trying out for the junior high track team? That demand defies their experience and asks them to imagine a scenario well beyond their scope.

Clearly, these questions are poorly designed assessments of 4th graders’ CCSS learning. (We are setting aside the disagreements we have with those standards in the first place, and simply assessing the PARCC on its utility for measuring what it was intended to measure.)

Rather than debate the CCSS, we want to expose the tragic reality of the countless public schools organizing their entire instruction around trying to raise students’ PARCC scores.

Without naming any names, I can tell you that schools are disregarding research-proven methods of literacy learning. The “wisdom” coming “down the pipeline” is that children need to be exposed to more complex texts because that is what PARCC demands of them. So children are being denied independent and guided reading time with texts of high interest and potential access; instead they are handed texts that are much too hard (frustration level) all year long, without ever being given the chance to grow as readers in their Zone of Proximal Development (pardon my reference to those pesky educational researchers like Vygotsky).

So not only are students who are reading “on grade level” going to be frustrated by these so-called “complex texts,” but newcomers to the U.S. and English Language Learners and any student reading below the proficiency line will never learn the foundational skills they need, will never know the enjoyment of reading and writing from intrinsic motivation, and will, sadly, be denied the opportunity to become a critical reader and writer of media. Critical literacies are foundational for active participation in a democracy.

We can look carefully at one sample to examine the health of the entire system, much as we can test a drop of water to assess the ocean. So too, we can use these three PARCC prompts to glimpse how the high-stakes accountability system has deformed teaching and warped learning in many public schools across the United States.

In this sample, the system is pathetically failing a generation of children who deserve better, and when they are adults, they may not have the skills needed to engage as citizens and problem-solvers. So it is up to us, those of us who remember a better way and can imagine a way out, to make the case for stopping standardized tests like PARCC from corrupting the educational opportunities of so many of our children.