Saturday 4 May 2013

Working for a Living: Musings on the Working Life in the Belly of the Educational Bureaucracy...

Here is an email, dear readers, I recently wrote to one of the statisticians at Evaluations at SUNY about SPI's, SPI's being, in SUNYspeak, Student Perception Indicators...

I usually don't look at the data on SPI's because it is fatally flawed. The only reason I am requesting some of this data at this time is because the Department I teach in now requires it.

There are a number of theoretical and methodological problems with these "indicators". One fundamental problem is that they are grounded in a retail model of evaluation, the customer-is-always-right model. There are a number of problems with such a model. It doesn't deal with the competencies of consumers, for instance. Another major problem is that you can't simply look at the retail side of things. You have to compare and contrast it with the items for sale. You need, in other words, to look at retail comments in relation to what was sold, namely the syllabus, which, by mandate, contains our objectives (I forget the fancy name SUNY gives them) and our lesson plan.

The nice thing about Digital Measures (which I had to use at RPI when I taught there)--and this is why I am recommending its use here at SUNY, and the expenditure on it, if you want to continue with this approach--is that you have to put your objectives, and how you tried to reach them, into DM. This allows the analyst to look at both the sales side and the retail side. I think you can also put in a link to your syllabus, the syllabus that many consumers don't read, or only lightly read, these days for some reason (the Michelle Rhee School of Education, i.e., teaching for the test, which involves doing everything for the consumer).

While using Digital Measures is a vast improvement on the consumer-only approach because it adds the sales side of the ledger to the equation, there are still a number of problems with "teacher evaluations", but that is another and much longer story. Suffice it to say that statistics are all surface and no depth. And that is one of the problems.

This email was predicated on the fact that as an adjunct at SUNY I now have to access and record grade breakdown percentages and the means for the consumer responses from the SPI's, the Student Perception Indicators SUNY collects at the end of each term for each class, at least theoretically. I needed this data for the new and improved will-you-be-rehired-in-the-Department expanded FAR forms we adjuncts are now required to do. And boy do I enjoy doing more work for the same pittance as before, when we didn't have to do this significantly and substantially expanded bureaucratic work. We only had to do FAR's, in SUNYspeak Faculty Activity Reports, which, since they were electronic, were relatively quick and painless to do once you had the basic data in. Anyway, I couldn't unzip the online versions of the SPI's from home on my Mac, and I was annoyed by the far too many step process of accessing them in the first place. Using these things should be easy not only for tech heads but also for those with limited tech experience like me. The reason so many steps are required to access this data is that SUNY, you see, does this in-house, making access more complicated than usual, because they are starting from scratch and using relatively primitive software to do it.
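For what it's worth, the arithmetic these expanded FAR forms demand is trivial. Here is a minimal sketch in Python of what the recording amounts to, assuming--and this is my assumption for illustration, not SUNY's actual export format--that an SPI item boils down to a list of 1-to-5 ratings and the grade book to a list of letter grades:

    from collections import Counter

    def grade_breakdown(grades):
        # Percentage of the class earning each letter grade.
        counts = Counter(grades)
        total = len(grades)
        return {letter: 100 * n / total for letter, n in counts.items()}

    def spi_item_mean(ratings):
        # Mean of the 1-to-5 consumer responses on a single SPI item.
        return sum(ratings) / len(ratings)

    # Hypothetical data for one class; the real export is a zipped file
    # behind a far-too-many-step login, which is part of the complaint.
    grades = ["A", "A", "B", "B", "B", "C", "D"]
    ratings = [5, 4, 4, 3, 5, 2, 4]

    print(grade_breakdown(grades))           # {'A': 28.6..., 'B': 42.9..., 'C': 14.3..., 'D': 14.3...}
    print(round(spi_item_mean(ratings), 2))  # 3.86

The point, of course, is not that the arithmetic is hard but that data this simple should be this easy to get at.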

There are programmes out there, as I mentioned, like Digital Measures, which, if SUNY wanted to spend the money, make accessing statistical material like the SPI's relatively easy, almost as easy as opening an email programme. Additionally, DM contains objectives (SLOs in SUNYspeak) from syllabi, making comparison between these and student perceptions possible for evaluators with relative ease, something any analyst worth his or her analytical salt should do automatically and which should be de rigueur in the teacher evaluation "profession", however much more work it would require the analyst to do.

I don't keep the paper copies of the SPI's SUNY gives us. The major reason I don't keep them, apart from not having a place to put them--the campus digs for adjuncts is usually a room that we inhabit with a bunch of other adjuncts and a few ancient, and I mean ancient, computers--is, as I noted, that I don't find them particularly helpful in my teaching.

One of the reasons, one I touched on earlier, I don't find SPI's helpful is that they don't counterpoint consumer perceptions with the objectives in teacher syllabi and lesson plans, both of which reveal the logic and intentions behind our classes. That counterpoint should be mandatory since it acts as a check and balance on student perceptions.

What I find most interesting about SPI's--I am a student of human history and its culture, after all--is what they reveal about many college students today. Before I get to that, however, I want to talk a bit about the measures themselves. There are several theoretically and methodologically questionable assumptions at the heart of SPI's and teacher evaluations in general. They seem to assume that students of college-going age are tabula rasa--they don't, after all, collect data on students' economic, political, cultural, and educational backgrounds--an assumption Locke himself never made. Locke argued only that humans were born with a blank slate, not that they remained one at eighteen. Apparently the people who put together these measures think that students are wet sponges just waiting to sop up knowledge in their courses, even the general education courses they are required to take. Apparently they also think that there is no "noise"--things like competency, ideology, or even whether the syllabus was read--getting in the way of students' ability to rationally analyse a class by checking the objectives in the syllabus (we are mandated to put them there) against the structure and content of the course. There are, however, a number of problems with this assumption. Empirical audience analysis over the years has shown that cultural contexts, cultural ways of seeing, cultural ways of perceiving, cultural ideologies, impact how one "reads" the social and cultural stuff of the human world. Audience analysis, in other words, tells us as much if not more about the reader than about that which they are reading. Humans are more Bender than Data before he got emotions, as corporations and advertisers, but apparently not economists, know. Additionally, one can raise valid questions about whether 18, 19, 20, and 21 year olds have the academic competencies to judge a class put together by someone with years of academic experience, just as one can raise questions as to whether young people are fully competent to debate, in an analytical and systematic way, the ethics and morality of the wars they are often called on to fight. Age, and the experience it can bring, is the great unequaliser. And then there is the fact that more and more students don't seem to be reading the syllabus on a regular basis anymore, if SPI's and observations I have made in classes over the years are a guide, or using that syllabus as a guide for their sojourn through the class.

Syllabitis has been a problem in every class, lower division and upper division, I have taught since at least 2007. Putting syllabi and lesson plans online, where they are easily accessible, hasn't, by the way, improved this situation. It has actually made it worse. The post-computer generation isn't necessarily computer or World Wide Web competent. The worst case of syllabitis I ever saw in one of my classrooms was in a Journalism class I taught at Albany State University in 2009. A number of the students in this class--I don't know exactly how many, though it was, I gathered, more than one or two--said that I did not tell them when assignments were due, what assignments were worth in terms of grade points, and what the grading breakdown was for each assignment. And they took their "concerns" to the chair of the Department of English, the department that housed Journalism classes. The fundamental problem with these claims, as the Department Chair and Journalism head found out when I was called in to address these "concerns", was that all of these things these students said were empirically false. All of them were clearly addressed in the syllabus.

So how did these students miss what the syllabus said about all of these things? They can't read English? Don't think so. They can't comprehend written English? Don't think so. They didn't recall the syllabus and didn't bother to return to it to see what the answer to their questions was, despite the fact that it was online and accessible from any computer anywhere around the globe with access to the internet? Perhaps. They didn't read the syllabus? Perhaps. Regardless of the reason or reasons, I think this tells us something about students, student backgrounds, and the ever increasing work that teachers have to do these days beyond the putative subject matter of their college classes, which suffers as a result. Today I have to teach students not only that subject matter but also how to write a paper, how to use a computer, how to set up a Blogger account, and how to use a syllabus, including the importance of referring to it throughout the course of the term. This problem, by the way, is not only a problem in public universities like SUNY Albany and SUNY Oneonta but also in private colleges like RPI with their more selective students.

I attribute syllabitis (and the problematic teacher evaluation measures we use these days), in part, to the Michelle Rhee School of Educational Dogma that emphasises teaching for the test, a strategy which leads to just the opposite of what we should be doing, even in high school, where we should be teaching not only substance in a more interdisciplinary fashion but also independence, including intellectual independence. Instead what takes place in high school, as far as I can tell, is a situation in which students are told what they need to do by teachers verbally on a daily basis and basically told that education is simply an automatonic regurgitation of the "facts", man or ma'am. Education today attempts, in other words, to turn students into dependent robots. Their ability, or inability, to regurgitate "facts" on "tests" is seen by those pushing the Michelle Rhee educational theology as an indication of how well the teacher is doing. The teacher as scapegoat. I suspect that student expectations are coloured by this regurgitation model of education when they matriculate at college, and so they expect the same thing there and don't understand the importance of the syllabus in college education. As a result I now have to go through almost every inch of the syllabus during class time, something I didn't have to do in the past, in order to try to get students to understand its importance. I routinely mention the importance of the syllabus throughout the course of the class. Hardly a week goes by that I don't have to remind students that what they are asking about is in the syllabus. In the past I have even made students sign a statement that they have read and understood the syllabus. I will let you decide what this says about college students today and what it is like teaching in colleges today. I will also let you decide whether robotic regurgitation should be at the heart of a liberal arts "higher" education.

Suffice it to say that the teaching-for-the-test regurgitation model of education is, in my opinion, undermining the very essence of a liberal arts education, namely, critical thinking, critical independence, telling what is rot from what is not rot. It may be fine for education in grades 1 through 4 or 5, but it is not what a liberal arts higher education should be. Unfortunately, the Michelle Rhee model is, year after year after year, becoming the way higher education in America works. C'est la vie dans l'académie.
