Education

by Jennifer Hurley 

When I read an essay with a rubric attached, I read with an evaluative mind, looking for where the student had succeeded and where they had not. But when I read an essay without a rubric attached, I read with curiosity about what the student had to say. I engaged more with the ideas in the essay, and my comments reflected that. Some of my feedback was evaluative, but it was more with the goal of helping students find their best ideas and express them more powerfully.

Ditching my rubrics freed me up to make the kind of comments that could most help my students. I could make observations that had no judgment attached. I could tell the student where I cheered for them and where I was puzzled. I could appreciate specific parts of an essay without worrying how they connected or didn’t connect with the rubric. I could notice what was unique about that student’s writing or make connections to the student’s previous work. I could offer ideas for how the student could expand or pose questions to get them thinking more. I could ask students to respond back to me on a particular issue, thereby starting a dialogue. I could tell my students how I personally connected with what they wrote, which built their trust in me. Most important of all, I could show through my feedback that my students’ ideas were heard — that I cared about what they had to say. I could give my students a reader — not a judge, not a critic, but a reader.

by Alfie Kohn 

I eventually came to understand that not all alternative assessments are authentic.  My growing doubts about rubrics in particular were prompted by the assumptions on which this technique rested and also the criteria by which they (and assessment itself) were typically judged.  These doubts were stoked not only by murmurs of dissent I heard from thoughtful educators but by the case made for this technique by its enthusiastic proponents.  For example, I read in one article that “rubrics make assessing student work quick and efficient, and they help teachers to justify to parents and others the grades that they assign to students.” To which the only appropriate response is: Uh-oh.

First of all, something that’s commended to teachers as a handy strategy of self-justification during parent conferences (“Look at all these 3’s, Mrs. Grommet!  How could I have given Zach anything but a B?”) doesn’t seem particularly promising for inviting teachers to improve their practices, let alone rethink their premises.

Second, I’d been looking for an alternative to grades because research shows three reliable effects when students are graded:  They tend to think less deeply, avoid taking risks, and lose interest in the learning itself. The ultimate goal of authentic assessment must be the elimination of grades. But rubrics actually help to legitimate grades by offering a new way to derive them.  They do nothing to address the terrible reality of students who have been led to focus on getting A’s rather than on making sense of ideas.

by Caitlin Cassidy in The Guardian  

More than a dozen academics and students who spoke to Guardian Australia, most on the condition of anonymity, said the universities’ financial reliance on foreign students over many years had hollowed out academic integrity and threatened the international credibility of the sector.

Many said the rise of artificial intelligence was accelerating the crisis to the point where the only way to fail a course would be to hand nothing in, unless universities came up with a coherent institutional response.

A tutor in an arts subject at a leading sandstone university said in recent years the number of overseas students in her classes – who may pay up to $300,000 in upfront costs – had reached as high as 80%.

“Most can’t speak, write or understand basic English,” she said. “They use translators or text capture to translate the lectures and tutorials, translation aids to read the literature and ChatGPT to generate ideas.

“It’s mind blowing that you can walk away with a master’s degree in a variety of subjects without being able to understand a sentence.”

via quackademic
by Alfie Kohn 

A remarkable body of research over many years has demonstrated that the sort of teaching in which students are provided with answers or shown the correct way to do something — where they’re basically seen as empty receptacles to be filled with facts or skills — tends to be much less effective than some variant of student-centered learning that involves inquiry or discovery, in which students play an active role in constructing meaning for themselves and with one another.

[…]

Now put yourself in the place of one of those hard-liners who want teachers to remain the center of gravity in the classroom, disgorging information. How might you circle the wagons despite all the research that undercuts your position? Even more audaciously, how could you try to get away with saying direct instruction (DI) is “evidence-based” or supported by the “science of learning” — a favorite rhetorical gambit of traditionalists?

To the rescue comes an idea called cognitive load theory (CLT). This concept, primarily associated with an Australian educational psychologist named John Sweller, basically holds that trying to figure things out for yourself uses up so much working memory that too little is left to move whatever has been learned into long-term memory. It’s therefore more efficient for the teacher just to show students problems that have already been worked out correctly or provide them with “process sheets” that list step-by-step instructions for producing the right answer. (Imagine Jack Nicholson as the cognitive load theorist, hollering at students, “Inquiry? Your brain can’t handle inquiry!”)

by Gerald Coles in CounterPunch  

By blaming poor children’s school learning failure on their brains, the billionaires are continuing a long pseudoscientific charade extending back to 19th-century “craniology,” which used head shape and size to explain the intellectual inferiority of “lesser” groups, such as southern Europeans and blacks. When craniology finally was debunked in the early 20th century, psychologists devised the IQ test, which sustained the mental classification business. Purportedly a more scientific instrument, it was heavily used not only to continue craniology’s identification of intellectually inferior ethnic and racial groups, but also to “explain” the educational underachievement of black and poor-white students.

After decades of use, IQ tests were substantially debunked from the 1960s onward, but new, more neurologically complex, so-called brain-based explanations emerged for differing educational outcomes. These explanations conceived of the overall brain as normal, but contended that brain glitches impeded school learning and success. Thus entered “learning disabilities,” “dyslexia,” and “attention deficit hyperactivity disorder (ADHD)” as major neuropsychological concepts to (1) explain school failure, particularly for poor children, although the labels also extended to many middle-class students; and (2) serve as “scientific” justification for scripted, narrow pedagogy in which teachers seemingly reigned in the classroom but in fact were themselves controlled by the prefabricated curricula.

by Carol Black 

Why is it clear to us that it's degrading and objectifying to measure and rank a girl’s physical body on a numeric scale, but we think it’s perfectly okay to measure and rank her mind that way?

Over the years I've watched the many ways that children try to cope with the evaluative gaze of school. (The gaze, of course, can come from parents, too; just ask my kids.) Some children eagerly display themselves for it; some try to make themselves invisible to it. They fight, they flee, they freeze; like prey animals they let their bodies go limp and passive before it. Some defy it by laughing in its face, by acting up, clowning around, refusing to attend or engage, refusing to try so you can never say they failed. Some master the art of holding back that last 10%, of giving just enough of themselves to "succeed," but holding back enough that the gaze can't define them (they don't yet know that this strategy will define and limit their lives). Some make themselves sick trying to meet or exceed the "standards" that it sets for them. Some simply vanish into those standards until they don't know who they would have been had the standards not been set.

[…]

When a child does something you can't understand, something that doesn't make sense, when they erupt into recklessness, or fold up into secrecy and silence, or short-circuit into avoidance, or dissipate into fog and unfocus, or lock down into resistance, it's worth asking yourself: are they protecting themselves from the gaze?

by Patrick Culbert 

With appropriate support and mentorship, most Ph.D. students possess motivation independent of grades. Though it may vary with the highs and lows of the research process, this motivation is (mostly) related to students’ intrinsic interest in their field and topic of study. (Though the recognition afforded by the degree itself also serves as extrinsic motivation for most.) Even without the grade, graduate students work hard. They re-do, revise, and revise some more. They seek out feedback from others as they work to improve their abilities and understanding. They pursue intriguing ideas and approaches, and while it may be a frustrating waste of time when some of those don’t pan out, there is no explicit penalty. They are empowered to take risks. They are taught that failure is normal, even if it is painful. They learn that an experiment with an unexpected result is often what leads to new questions, insights, or even breakthroughs. When they graduate and enter the job market, especially academia, potential employers will evaluate them based on their body of work and narrative assessments of their abilities (recommendation letters).

Those statements above seem almost silly when applied to a dissertation, but statements like these are typical justifications of the necessity of grading undergraduates. Why are our views on grades for undergraduates often in direct opposition to how we might perceive grading dissertations?

by Finn Mackay in The Guardian  

In December, five years later than promised, the Tories finally delivered draft, non-statutory guidance for schools on “gender questioning children”. It provoked criticism and concerns from all sides, and is open for consultation until March. But whatever its final form, one aspect of the guidance has gone largely unnoticed.

The document doesn’t tell us anything we don’t already know about this government’s hostile stance on trans identities, inclusion and rights; but, unfortunately, what it does do is further solidify in official documentation and language the politicised phrase “gender identity ideology”. The government is attempting to bring into the mainstream this contested term, a creation of rightwing sex and gender conservatism that dates back to the 1990s, and which forms a key part of renewed attacks against the LGBTQ+ community.

As used in this context, the phrase “gender identity ideology” is actually nothing to do with gender, as in masculinity and femininity, and how this shapes our identities. Instead, it is used to imply that trans, transgender and gender non-conforming identities are a new fad, and that the longstanding social justice movement for trans rights is really a recent conspiracy of nefarious elites.

The use of terms such as “gender identity ideology”, “gender identity” and “social transition” serves to obscure the ideology of gender that members of this government, like all sex and gender conservatives, merrily adhere to themselves, and enforce on us all. Gender ideology is real, but it wasn’t invented by trans men or trans women, and it doesn’t just apply to trans or transgender people. The real gender ideology is the binary sex and gender system that requires all of us to be either male-masculine-heterosexual or female-feminine-heterosexual; and which attaches harsh penalties to those who deviate from this script. Almost all of us will have been socialised on to pink or blue paths from birth, if not by our immediate family, then by the books, TV, toys, clothes and adverts that surrounded us in wider society. This socially prescribed gender informs our gender identity.

by Jason Hickel 

Capitalism relies on maintaining an artificial scarcity of essential goods and services (like housing, healthcare, transport, etc), through processes of enclosure and commodification. We know that enclosure enables monopolists to raise prices and maximize their profits (consider the rental market, the US healthcare system, or the British rail system). But it also has another effect. When essential goods are privatized and expensive, people need more income than they would otherwise require to access them. To get it they are compelled to increase their labour in capitalist markets, working to produce new things that may not be needed (with increased energy use, resource use, and ecological pressure) simply to access things that clearly are needed, and which are quite often already there.

Take housing, for example. If your rent goes up, you suddenly have to work more just to keep the same roof over your head.  At an economy-wide level, this dynamic means we need more aggregate production — more growth — in order to meet basic needs.  From the perspective of capital, this ensures a steady flow of labour for private firms, and maintains downward pressure on wages to facilitate capital accumulation. For the rest of us it means needless exploitation, insecurity, and ecological damage. Artificial scarcity also creates growth dependencies: because survival is mediated by prices and wages, when productivity improvements and recessions lead to unemployment people suffer loss of access to essential goods — even when the output of those goods is not affected — and growth is needed to create new jobs and resolve the social crisis.

There is a way out of this trap: by decommodifying essential goods and services, we can eliminate artificial scarcity and ensure public abundance, de-link human well-being from growth, and reduce growthist pressures.

by Lewis Powell 

No thoughtful person can question that the American economic system is under broad attack. This varies in scope, intensity, in the techniques employed, and in the level of visibility.

[…]

The sources are varied and diffused. They include, not unexpectedly, the Communists, New Leftists and other revolutionaries who would destroy the entire system, both political and economic. These extremists of the left are far more numerous, better financed, and increasingly are more welcomed and encouraged by other elements of society, than ever before in our history. But they remain a small minority, and are not yet the principal cause for concern.

The most disquieting voices joining the chorus of criticism come from perfectly respectable elements of society: from the college campus, the pulpit, the media, the intellectual and literary journals, the arts and sciences, and from politicians. In most of these groups the movement against the system is participated in only by minorities. Yet, these often are the most articulate, the most vocal, the most prolific in their writing and speaking.

Moreover, much of the media, for varying motives and in varying degrees, either voluntarily accords unique publicity to these “attackers,” or at least allows them to exploit the media for their purposes. This is especially true of television, which now plays such a predominant role in shaping the thinking, attitudes and emotions of our people.