kitchen table math, the sequel: 1/6/13 - 1/13/13

Saturday, January 12, 2013

in other words, the answer is homeschooling

For some reason, I stumbled upon an Education News interview with Joshua Angrist that contains the following exchange:
[QUESTION]
School quality and human capital ARE major issues for all Americans. But we all know that some schools are failing. What can the typical parent do for their child other than attempt to home school?

[ANSWER]
Some schools are better than others. For many parents, however, this is not worth worrying about. For example, I never worried much about my kids schooling. I told them that teaching is hard, many teachers are mediocre at best, and they should try to get something out of badly taught classes as well as inspiring ones. The evidence suggests that’s a reasonable approach for children in educated families like mine. I worry most about the children of teen mothers, from families where there isn’t much adult supervision, little in the way of role models, and little hope for a middle class life. In this situation, a good school can make a huge difference.

An Interview with Josh Angrist: School Quality - Who Decides? 
For white kids, good enough is good enough.

My first reaction was exasperation. Don't worry, be happy doesn't cut it, and I am in a position to know. I am the parent of a (white) college freshman, and I teach (some) white college freshmen as well as black and Hispanic students. None of them -- black, white, or brown -- are where they should be, and Ed, who occasionally teaches (mostly white) freshmen at NYU, will tell you the same.

Maybe "the evidence" used to "suggest" that graduating high school as a white 18-year-old with mediocre skills was "a reasonable approach," but this interview was published just 9 months ago, and things are different now.

See, e.g., the Hamilton Jobs Gap calculator. If the economy continues to create 155,000 jobs per month, which has been the rate for the past two years, full employment does not return until after 2025. At that point today's college freshmen will be 31 years old and will have spent their first decade of employment in a buyer's market for labor. In a buyer's market, employers have more applicants than they can sort through and, often, no real need to hire if they can't find a purple squirrel.

See Urban Dictionary for the expression that covers that situation.
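Back to the jobs-gap arithmetic for a moment. Here is a minimal sketch, in Python, of the kind of calculation the Jobs Gap calculator performs. Only the 155,000-jobs-per-month rate comes from the paragraph above; the size of the gap and the number of jobs needed each month just to keep pace with labor-force growth are placeholder assumptions, not figures taken from the Hamilton calculator.

# Rough sketch of the jobs-gap arithmetic. The gap size and the
# keep-pace rate are illustrative assumptions, not Hamilton's own numbers.
JOBS_GAP = 11_000_000            # assumed jobs shortfall as of early 2013
MONTHLY_JOB_CREATION = 155_000   # rate cited in the post
KEEP_PACE_RATE = 88_000          # assumed jobs/month needed just to absorb new workers

net_progress_per_month = MONTHLY_JOB_CREATION - KEEP_PACE_RATE
months_to_close = JOBS_GAP / net_progress_per_month
print(f"roughly {months_to_close / 12:.1f} years to close the gap")
# prints roughly 13.7 years from early 2013 -- i.e., under these
# assumptions, full employment returns only after 2025.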

"Never worrying much" about your kids' schooling is a luxury white parents no longer enjoy, along with all those other luxuries that disappeared when household wealth fell by 40% in 3 years' time. Assuming Angrist is right about what parents "can do" (nothing), and I'm sure he is, then homeschooling is the answer.

That was my first reaction.

My second reaction was: jeeeeeezzz.

"...many teachers are mediocre at best..."

"...the children of teen mothers, from families where there isn’t much adult supervision, little in the way of role models..."

In one short paragraph, he's managed to insult both teachers and a fair share of minority parents, while dismissing afterschoolers and math warriors out of hand.

Pretty efficient, I'd say.

Friday, January 11, 2013

mom of 4 fills out a survey

A classic school story from mom of 4------
the edworld (among others) doesn't give up its stranglehold easily. Our old district was apparently required to send out parent surveys to find out if the HS cluster wished to continue the 7-8 JHS format or move to the MCPS-proposed 6-7-8 MS format. The vote was something over 90% wishing to keep the JHS format, but we got the MS anyway. Apparently, there was no requirement that anyone read the results or follow the parent wishes. All of the best things (academics) about the JHS while my older kids were there had been lost and all of the worst things (artsy-crafty, touchy-feely, non-academic) came from ES to the MS. My younger kids hated it. Sigh
I'm laughing!

Not one hour ago, I finally worked up the energy to track down this year's "Student Performance Review"...which I find has, once again, failed to disaggregate the 3s from the 4s!

Surprise!

I spent six years of my life trying to get my district to disaggregate the 3s from the 4s. Before that, a parent who works at CUNY and was involved in the standards movement here in New York state spent a lot of time and energy doing the same.

Now it's 2013, C. is in college, and the 3s and the 4s are still one.

update 1/18/2013: No! I'm going blind! The 3s and the 4s are disaggregated! (I thought I remembered LJ - the CUNY dad - having finally prevailed on that one ---- )

[pause]

I've just searched the Irvington Parents Forum for the word "disaggregate." I find a post dated 10/10/2007 that notes the 3s-and-4s issue but points out that a combined 3s-and-4s category is an improvement on what came before.

You should all take a look at that post if you have a moment. It includes a note from the then-principal of the middle school refusing to tell Ed and me how the black and Hispanic students were doing on the state tests. The year before, I think it was, not one black or Hispanic 8th grade student had passed. So we were asking for data.

No Child Left Behind was supposed to spark that kind of behavior, and in our case it did.

Reading that old post, I think we did make a bit of progress. There was a time, not too long after, when the administration-slash-board required people to FOIL everything. (Which we did.) The only documents you could get from the school without a FOIL request were a school calendar and your child's report card. Things were so bad that my friend Robyne, who had been elected to the board in a landslide vote, had to FOIL district documents.

Today we have disaggregated 3s and 4s!

Jo Boaler should release her data

background: Educational malpractice for the sake of reform math

FERPA protects the identities of individual students, not schools.

If FERPA protected the identities of schools, No Child Left Behind would be illegal. Clearly it is not.

Ms. Boaler should release her data.

Thursday, January 10, 2013

equator therapy

Back from Aruba --- and wow!

I am in good spirits.

Equator sun.

I recommend it!

help desk – online instruction for New York Regents geometry course

Any ideas on an online course that might be appropriate for a student who has been struggling with New York State Regents geometry? The course would have to prepare the student to pass the NYS geometry Regents exam. What about online tutoring options? While this student is self-directed in some ways, he is not one to “teach himself” from a book or other resources. He probably needs direct instruction, with targeted feedback and guidance on his progress. Should online even be an option in this case? Does he simply need a traditional tutor? Maybe a blended learning option would work well? Any other alternatives?

Any and all ideas would be welcome.

Tuesday, January 8, 2013

Educational malpractice for the sake of Reform Math

A couple of weeks ago, James Milgram, an emeritus Professor of Mathematics at Stanford University, updated me on some recent developments in the controversy over Jo Boaler's "Railside Study." It was only after I reviewed the various critiques, accusations, and rebuttals that I remembered what an enormously consequential case of educational malpractice is afoot here--one that deserves much wider attention than it's gotten so far.

Professor Milgram is known in the education world for his comprehensive critique of a study done by Jo Boaler, an education professor at Stanford, and Megan Staples, then an education professor at Purdue. Boaler and Staples' paper, preprinted in 2005 and published in 2008, is entitled Transforming Students’ Lives through an Equitable Mathematics Approach: The Case of Railside School. Focusing on three California schools, it compares cohorts of students who used either a traditional algebra curriculum or the Reform Math algebra curriculum College Preparatory Mathematics (CPM). According to Boaler and Staples' paper, the Reform Math cohort achieved substantially greater mathematical success than the traditional math cohorts.

In early 2005 a high-ranking official from the U.S. Department of Education asked Professor Milgram to evaluate Boaler and Staples' study. The reason for her request? She was concerned that, if Boaler and Staples' conclusions were correct, the U.S. Department of Education would be obliged, in Milgram's words, "to begin to reconsider much if not all of what they were doing in mathematics education." This would entail an even stronger push by the U.S. educational establishment to implement the Constructivist Reform Math curricula throughout K12 education.

Milgram's evaluation of Boaler and Staples' study resulted in a paper, co-authored with mathematician Wayne Bishop and statistician Paul Clopton, entitled A close examination of Jo Boaler's Railside Report. The paper was accepted for publication in the peer-reviewed journal Education Next, but statements made to Milgram by some of his math education colleagues caused him to become concerned that the paper's publication would, in Milgram's words, make it "impossible for me to work with the community of math educators in this country"--involved as he then was in a number of other math education-related projects. Milgram instead posted the paper to his Stanford website.

This past October a bullet-point response to Milgram's paper, entitled "When Academic Disagreement Becomes Harassment and Persecution," appeared on Boaler's Stanford website. A month ago, Milgram posted his response and alerted me to it. I have his permission to share parts of it here.

Entitled Private Data - The Real Story: A Huge Problem with Education Research, this second paper reviews Milgram et al's earlier critiques and adds several compelling updates. Together, the two papers make a series of highly significant points, all of them backed up with transparent references to data of the sort that Boaler and Staples' own paper completely lacks.

Indeed, among Milgram et al's points is precisely this lack of transparency. Boaler and Staples refuse to divulge their data, in particular data regarding which schools they studied, claiming that agreements with the schools and FERPA (Family Educational Rights and Privacy Act) rules disallow this. But FERPA only involves protecting the school records of individual students, not those of whole schools. More importantly, refusals to divulge such data violate the federal Freedom of Information Act. Boaler's refusal also violates the policies of Stanford University, specifically its stated "commitment to openness in research" and its prohibitions of secrecy, "including limitations on publishability of results."

Second, Milgram et al's examination of the actual data, once they were able to track it down via California's education records, shows that the data were distorted in multiple ways.

1. Boaler and Staples' chosen cohorts aren't comparable:
It appears, from state data, that the cohort at Railside [the pseudonym of the Reform Math school] was comprised of students in the top half of the class in mathematics. For Greendale, it appears that the students were grouped between the 35th and 70th percentiles, and that the students at Hilltop were grouped between the 40th and 80th percentiles. [Excerpted from Milgram; boldface mine]
2. Boaler and Staples' testing instruments are flawed:
Our analysis shows that they contain numerous mathematical errors, even more serious imprecisions, and also that the two most important post-tests were at least 3 years below their expected grade levels.  [Excerpted from Milgram; boldface mine]
3. The data comparing test scores on California's standardized tests (STAR) comes from a comparison of test scores from students not involved in Boaler and Staples' study:
The students in the cohorts Boaler was studying should have been in 11th grade, not ninth in 2003! So [this] is not data for the population studied in [Boaler and Staples' paper]. This 2003 ninth grade algebra data is the only time where the Railside students clearly outperformed the students at the other two schools during this period. There is a possibility that they picked the unique data that might strengthen their assertions, rather than make use of the data relevant to their treatment groups. [Excerpted from Milgram; boldface mine]
4. The most relevant actual data yields the opposite conclusion about the Reform Math cohort's mathematical success relative to that of the traditional math cohorts:
o The most telling data we find is that the mathematics remediation rate for the cohort of Railside students that Boaler was following who entered the California State University system was 61%
o This was much higher than the state average of 37%
o Greendale's remediation rate was 35%
o and Hilltop's was 29%.
5. School officials at "Railside" report that the results of the Reform Math curriculum are even worse than Milgram et al had originally indicated:
A high official in the district where Railside is located called and updated me on the situation there in May, 2010. One of that person's remarks is especially relevant. It was stated that as bad as [Milgram et al's original paper] indicated the situation was at Railside, the school district's internal data actually showed it was even worse. Consequently, they had to step in and change the math curriculum at Railside to a more traditional approach.

Changing the curriculum seems to have had some effect. This year (2012) there was a very large (27 point) increase in Railside's API score and an even larger (28 point) increase for socioeconomically disadvantaged students, where the target had been 7 points in each case.
6. Boaler’s responses to Milgram et al provide no substantiated refutations of any of their key points:

In response to comments on an article on Boaler's critique of Milgram, Boaler states:
"I see in some of the comments people criticizing me for not addressing the detailed criticisms from Milgram/Bishop. I am more than happy to this. [...] I will write my detailed response today and post it to my site."
However, as Milgram notes in his December paper:
As I write this, nearly two months have passed since Boaler's rebuttal was promised, but it has not appeared. Nor is it likely to. The basic reason is that there is every reason to believe [Milgram et al's paper] is not only accurate but, in fact, understates the situation at "Railside" from 2000 - 2005.
In a nutshell: under the mantle of purported FERPA protection, we have hidden and distorted data supporting a continued revolution in K12 math education--a revolution that actual data show to be resulting, among other things, in substantially increased mathematics remediation rates among college students. Ever lower mathematical preparedness; ever greater college debt. Just what our country needs.

Nor is Boaler's Reform Math-supporting "research" unique in its lack of transparency, in its lack of independent verification, and in its unwarranted impact on K12 math practices. As Milgram notes,
This seems to be a very common occurrence within education circles.

For example, the results of a number of papers with enormous effects on curriculum and teaching, such as [Diane Briars and Lauren Resnick's paper "Standards, assessments -- and what else? The essential elements of Standards-based school improvement"] and [J. Riordan and P. Noyce's paper, "The impact of two standards-based mathematics curricula on student achievement in Massachusetts"] have never been independently verified.

Yet, [Briars and Resnick's paper] was the only independent research that demonstrated significant positive results for the Everyday Math program for a number of years. During this period district curriculum developers relied on [Briars and Resnick's paper] to justify choosing the program, and, today, EM is used by almost 20% of our students. Likewise [Riordan and Noyce's paper] was the only research accepted by [the U.S. Department of Education's] What Works Clearinghouse in their initial reports that showed positive effects for the elementary school program "Investigations in Number, Data, and Space," which today is used by almost 10% of our students.
As Milgram notes:
Between one quarter and 30% of our elementary school students is a huge data set. Consequently, if these programs were capable of significantly improving our K-12 student outcomes, we would surely have seen evidence by now.
And to pretend that such evidence exists when it doesn't is nothing short of educational malpractice.