Instead, my concern is with the alarm-ringing, flag-waving, media-friendly BS that passes for scholarship these days.
The study, as you might guess from the title, purports to establish the "fact" that, in a study of "undergraduates at two dozen universities," the authors "concluded that 45 percent 'demonstrated no significant gains in critical thinking, analytical reasoning, and written communications during the first two years of college.'"
I kid you not.
Why isn't this cause for concern, you may ask? Why am I not worried that nearly half of undergrads aren't "significantly" improving (whatever that means) their "broad analytic and problem-solving skills" in the first two years of undergraduate study?
You're here reading, so I'll assume you want me to tell you.
First, the study was done on 2,300 undergrads at 24 institutions. Let's round that up to 2,400 to be generous, and to make the math easier. That means that on average, at each of two dozen campuses, they studied 100 students. Not a bad sample size, all things considered. Except that at my undergraduate institution, that would be 0.2% of the population. I don't know whether they take into account the differences between types of institutions (community colleges, R1s, SLACs, etc.), but I should like to know what statistical differences they found between them as well.
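For the curious, here's the back-of-the-envelope math as a minimal sketch. The 50,000-student campus is not a figure from the study; it's just what "100 students is 0.2% of the population" works out to for an institution like mine.

```python
# Back-of-envelope numbers for the sampling complaint above.
students_sampled = 2300   # undergrads in the study
institutions = 24         # campuses in the study

per_campus = students_sampled / institutions
print(f"average sample per campus: {per_campus:.0f}")  # ~96, call it 100

# Assumption: a large undergraduate institution of roughly 50,000 students,
# which is what "100 students = 0.2% of the population" implies.
campus_population = 100 / 0.002
print(f"implied campus population: {campus_population:.0f}")  # 50,000

sample_fraction = 100 / campus_population
print(f"sample as a share of that campus: {sample_fraction:.1%}")  # 0.2%
```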
Second, the study was done using the "Collegiate Learning Assessment":
...a standardized test that is essay-based and open-ended. (It is worth noting that in measuring broad analytic and problem-solving skills, the exam does not assess how much students concentrating in particular majors — physics or psychology, for example — have learned in their respective fields of study.)
It's a standardized test. Standardized tests do not, in fact, measure anything except the ability of a given student to take that particular standardized test. As far as I'm concerned, the sooner this country learns that, the better. Moving on, that little caveat (you know, that minor bit about not measuring the development of field-specific skills?) isn't so little.
One of the hardest parts of undergraduate learning is the steep jump in standards between even well-performing high schools and first-year university classes. There's a reason most students can expect their marks to drop by a full letter grade when they first enter undergraduate study: it's harder. That the first two years are spent getting up to speed in a new environment (both academic and social) would, on its own, be enough to account for the lower-performing 45% of students not showing "significant" improvement in broad analytical skills on a standardized test.
That doesn't even take into account that a huge number of undergraduate programs are specialized from the get-go, meaning that the first two years are the first two years of a psychology program, or of a business program, or of an electrical engineering program -- where those years are spent, specifically, on learning to operate in a specific discipline.
Maybe a student in second-year economics won't see much improvement in broad analytical skills just yet, but I'll bet s/he'll see a pretty major improvement in her/his ability to understand economic systems.
The study purports to address the problem that "almost no one asks the fundamental question posed by Academically Adrift: are undergraduates really learning anything once they get [to college]? For a large proportion of students, Richard Arum and Josipa Roksa’s answer to that question is a definitive no." (From the University of Chicago Press site.)
Are they learning anything? Well, they aren't getting better on this one standardized test. In the first two years. Well, a little under half of them aren't. And we didn't take into account the time it takes to adapt to a new learning environment, or that what they might be working hardest on isn't broad analytical skills, but rather discipline-specific skills we don't measure. So I guess our answer is no, they aren't learning anything.
Face, meet palm.
Of course, this can all be fixed (because it's broken, don't you see?) by having students do more work:
for example, they found that 32 percent of the students whom they followed did not, in a typical semester, take “any courses with more than 40 pages of reading per week” and that 50 percent “did not take a single course in which they wrote more than 20 pages over the course of the semester.”
My God, a third of students don't take courses that require more than 40 pages of reading a week? And fully half didn't write more than twenty pages in a single class in a single semester? Sacré Dieu! It's like half of the students they polled are in the sciences or something.
Of course it's the institutions' fault. The study asks "What if at the beginning of the 21st century many colleges and universities were not focused primarily on undergraduate learning, but instead had become distracted by other institutional functions and goals?"
Yes. What if.
I have another "what if" -- what if, at the beginning of the 21st century, academics were given financial incentives to pass off this kind of alarmist pap as significant research?
This is, from what I can tell, a poorly put-together argument that loosely associates poor research findings with scapegoating accusations. It begins, like all bad science, with an intended outcome and a reason for it (read: our schools are failing us and it's Your Fault) and then uses water-thin evidence to attempt to substantiate its grand sweeping claims.
If I'm completely off-base about this book, then I wholeheartedly apologize; at this point I can only go on the claims made about the book by the NYT and UCP. But if I'm wrong in my assessment, and this is in fact a well-written study whose authors understand all of the above and include caveats throughout explaining as much... well, if I were Professors Arum and Roksa, I would be positively livid -- not only with the New York Times for painting my study in such a ridiculous light, but also with the University of Chicago Press for the patently absurd copy accompanying the book.
Why do we need to always be alarmed by the state of education today?
3 comments:
I tend to agree with your skepticism, here. So many reports (even the CHE) seem to be confirming your reading of the argument, and the CHE in-depth piece shows some interesting push-back from professors and others involved in writing education: http://chronicle.com/article/Writing-Assignments-Are-Scarce/125984/
I attended an institution which used its score on the CLA to claim that "[Insert School Here] is #1 in how much students learn over four years!" It is true that my institution was a rather poor one, but it's also true that the CLA is a terrible test. It has all the problems of regular standardized tests--it only tests students' ability to take tests, and students know it has no impact on their GPA--but it is also bad as a standardized test. It may have improved over the past several years, but when I took it, the test itself (which is internet-based) malfunctioned frequently. Some of us found that the site shut down when we were partway through and awarded us our scores as if we had simply failed to complete the test. It is also incredibly abstract, even asking questions about morals. It's true that I went to a rather poor school, but I had a few good professors--and, as I'm in a ranked graduate program, I must have learned something--but the CLA is definitely a poor indicator of school quality, good or bad.
I would like to suggest that the study was also done on certain schools within the country you currently live in... and that your undergrad experience is not that of the country you currently live in ;)