Showing posts with label television. Show all posts

Friday, May 10, 2013

“Open hailing frequencies, Mr. Worf.”

A year or so ago, I applied for a job, an administrative position, at a nearby private school.  This was a position I really wanted; it wasn’t just a job for the purpose of receiving a salary and benefits (though the benefits – including tuition waivers for my children – were a major inducement).  My two decades of professional preparation in education seemed to point toward this exact position, almost as if its appearance in the online want ads was preordained.

Does anybody call them “want ads” anymore?  But I digress…

I submitted my vita, cover letter and a number of references.  And I was quite gratified when, from what I can only imagine must have been a national pool of candidates, I was contacted about setting up an interview.  This was good news, though it was not without its special set of stressors.  As a “corpulent American” (are we a protected class of citizen?) I am incredibly aware of the fact that clothing, even proper, businesslike formalwear, does not sit on me the way it does on, say, George Clooney.  My body makes the fabric do things it normally wouldn’t, and probably, if you asked it, would not wish to.  Plus, men are generally expected to wear ties.

Oh, dear.

I love ties.  Every time I’m in Macy’s (which happens often, as I have to walk through it to get to the stores in the mall that I actually shop at), I hover droolingly over the Jerry Garcia ties, wishing: a.) that I could afford them; and, b.) that my neck were more human-sized.  For even if I spent $200 on eight gorgeous Garcias, I would not have the shirts to wear them with.  Oh, I have plenty of shirts; all of my shirts have 18-½ or 19 inch necks.  And the shirts fit me fine, relatively speaking.  Except up there.  For me to wear a tie requires a shirt with a 20-inch neck, and that means specialty stores, and that means $50-60 for a shirt, and those are the sale prices.

To put this in perspective, an adjunct college instructor in upstate New York who teaches a full load – three classes in the fall, three in the spring – grosses, if he or she is lucky enough to land at a better-paying institution, around $23,000.  Extra-duty assignments (tutoring, paid exam grading, etc…) can bump that up to around $30,000.  There are no benefits.  Adjuncts do not get asked to teach summer sessions.  Sixty-dollar shirts ain’t gonna happen.  Ixnay on the Arciasgay.
 
But then, on top of that, I found out that the interview was going to be a Skype interview.  My first.

The concept is simple enough.  Every Star Trek episode ever, where Captain Kirk, or Picard, or whoever the spin-off shows’ captains were (I gave up after a while), barks “On screen!” to have a video chat with some alien life form aboard another ship – that’s Skype.  It’s a pretty simple principle.  And aside from the fact that the starship U.S.S. Enterprise was never cursed with Windows Vista, it should work pretty much the same – smooth, seamless and natural, like talking to a person sitting across the table from you.

Not so much.  Webcams are small, with a very limited field of vision. I was interviewed by a committee, which meant that they had to, whenever a different person wanted to question me, physically lift and re-orient the webcam so that I could see that person.  It’s also unusual to have such artificially limited feedback; not only is the field of vision limited, but the resolution, both audio and video, is less than optimal.  There is a lot of subtle information, signals and other cues – subvocalizations, body language, proxemics, not to mention whatever may be going on off-webcam – that gets missed on Skype.
 
But the worst part is that humans, when speaking with other humans, like to look each other in the eye, and it seems that no matter what you do with Skype, you can never actually achieve this.  If you look straight into the webcam, you cannot track the facial expressions of the person to whom you are speaking.  If you look straight into the monitor, you are looking the person in the (virtual) face, yes, but your own face will show up as looking somewhat askew or aslant.  (When I catch my own image in the inset picture and see that, I tend to subconsciously make slight corrections to “straighten myself out,” and this of course has the effect of making it even worse.)

For me, it all heightened the anxiety of the experience.  And I hate most interviews, even on a good day.  I find them trite and formulaic. Even interviewers I’ve spoken with hate the generic boilerplate questions that they are directed to ask.  The format itself is in no way an organic conversation, and is of limited value in assessing the worth of a candidate.  Skype makes this even worse, for me anyway, and I think in my case, it shows.  This does not bode well for my prospects with any would-be employer that wishes to interview me via Skype.  (If you are a would-be employer, please disregard this paragraph.  These aren't the sentences you're looking for.)

I get it. I know why employers do this.  It’s not to be trendy for the sake of trendiness; let’s leave that to grade-school educators and their stupid keep-up-with-the-Joneses mentality towards tech adoption.  In a national search, you simply have to extend the same privileges to all of your applicants equally.  I lived only 20 miles away from the school, but other applicants may well have been in other states, or even other countries.  It would not have been fair to offer me a face-to-face interview and not offer it to the other candidates.  And these days, in this economy, I suspect only schools with 9+-figure endowments are willing to pay to fly in interview candidates.  No hard feelings.  "Bygones," as Ally McBeal's Richard Fish would have said.

I did not get the job.  Was it because there was a more qualified candidate?  Sure, that’s a possibility.  I’m good, but I’m not so hubristic as to think that I de facto trump anyone that a national search would turn up.  (Yeah, I kind of am.)  All I’m saying is that the interview didn’t help.

Maybe Q (of Star Trek: The Next Generation fame) had a point:
"If you can't take a little bloody nose, maybe you ought to go back home and crawl under your bed. It's not safe out here. It's wondrous, with treasures to satiate desires both subtle and gross. But it's not for the timid."
Fast forward to this year.  One of my colleges announced openings for two tenure-track positions in my department.  I love this particular school (anybody on the Committee reading this?) and am desperately hoping I get one of the two positions.  I left teaching in secondary schools two years ago, and if I can afford it, I do not expect to go back.  (Click and read here for more on that choice.)  English, however, is a nasty discipline in which to try to find full-time work.  In the nine or so years that I have been teaching and applying for college English positions, I have learned that it is perhaps the most competitive of the disciplines, often with well over 100 CVs submitted for each tenure-track position at the community college level.  Time for that hubris to kick in.

When I was informed that I had been selected for a first-stage interview, I was ecstatic. (I had applied last year, and the year before, and not even been granted a preliminary interview!)  I have no numbers, but I’m guessing they would have whittled down 200 applicants to perhaps 20-30 for that first round.  Then I got the email:  “Please send the Committee your Skype contact information and select one of the available time slots.”

To quote “Ashes to Ashes” by David Bowie, “Oh no, not again.”

The interview was, in my opinion anyway, a train wreck.  For all of the reasons I mentioned above, and more.  However, I made it through to the next round, a fact I credit to the various Jedi mind tricks I surreptitiously employed to bolster my chances.  (Yes, I know, I pulled an Obama, crossing over between Star Trek and Star Wars.  What of it?  While I'm on the subject, however, do any Star Wars geeks out there know if Jedi are able to use their mind-control techniques over electronic communications media?)

The second round was a teaching demo.  I would imagine that perhaps half, or slightly fewer, of the Skype interviewees landed teaching demos, I’ll guess 8-12.  This is more my element.  I think I performed pretty well in the sample lesson, though I had to wait a week to find out I had made it through to the next round, which was a references check.  I was pretty sure that no one on my references list had any grudges or vendettas to fulfill against me, so I thought that would go well.  It took nine or ten days for me to come home from work and find this message on my machine:  “Hi, Andrew, this is so-and-so from the Provost’s office.  We’d like you to contact us to set up an interview for the English position.”

Woo-hoo! Hot damn!

I just got off the phone with the Provost’s executive assistant a couple of hours ago. She tells me that it’s going to be a Skype interview.

*facepalm*  Damn you, Q.



Wednesday, May 8, 2013

From the emergency room to the classroom -- bias and prejudice in "treatment"

Recently, a nurse named Lou Davis published a piece on slate.com, answering the question of what it is like to treat (medically) a person who has committed heinous acts, crimes or atrocities.  The idea is to explore how our biases, beliefs and preconceived notions get in the way of our ability to perform the duties with which we are tasked.

Davis writes:
The care I give is, to all intents and purposes, exactly the same as I would give to anyone else. I would be as attentive, my standards are as high, I am as diligent in giving pain relief or in ensuring that the wounds I am suturing are done in exactly the same way.  Outwardly then, there is no difference. None at all. You could stand by my side and be able to tell no difference.   
Inwardly is rather different. Over the years, I have treated murderers, rapists, pedophiles. I have carried on working while some of these patients have made comments, been rude or abusive.  I know that my standards have never been compromised.  That isn't to say that I don't have feelings of anger or revulsion or even hate. But it isn't my job to show those feelings.
As a lifelong fan of the hour-long medical drama genre (Chicago Hope, ER, Grey’s Anatomy, House; Marcus Welby was a bit before my time), this is a sub-plot that has been played out many times: the rapist or murderer who has been wounded by his (or her) victim and is rushed to the emergency room for treatment, where there is always at least one doctor or nurse who expresses his or her anguish at having to treat such a despicable person.  An example is ER, season 4, episode 13, “Carter’s Choice,” synopsis courtesy of  fandango.com:
The hospital's blood supply runs out during a devastating blizzard, forcing the doctors to make literal life and death decisions. Should [Doctor] Carter (Noah Wyle) try to save the life of a seriously wounded serial rapist, or should the blood go to someone more "deserving"?
A more extreme version of this same plot device was used in the House episode “The Tyrant,” in which the doctors are called upon to treat an ailing patient who turns out to be a genocidal African warlord:
When a controversial African politician (guest star James Earl Jones) falls ill, he is brought to Princeton Plainsboro [Hospital] for treatment. The team struggles with whether to help a merciless dictator being sued in the United States for crimes against humanity in his country.  (courtesy House Wiki)
Watch the full episode here.
I’d like to think that in real life, the decision is more cut-and-dried.  If you are a doctor or nurse, you treat the patient in front of you.  Ultimately, the course of action is perhaps best summed up in the words of advice given to Davis by a colleague: "Just think to yourself:  'We care for you, not about you' and it will make it easier."

In the classroom, however, the situation is somewhat different.  We do care, both for, and about, our students. We are tasked with acting in loco parentis, and our charge is not just imparting curriculum but, to a certain extent anyway (and we can agree to disagree about how much, as anyone who has read my posts will realize I am not a “progressive” in most ways) educating “the whole child.”  We do get emotionally involved with our children, we involve ourselves in their lives, we interact with their families, we live (in many cases) in their communities.

In other words, we cannot simply, to use a phrase common to my beloved medical dramas, “treat ‘em and street ‘em.”

In my classrooms, I have had violent kids, felons, drug dealers, hardcore gangbangers, kids with extensive records, and kids who have committed (or allegedly committed) horrific acts.  I have had students cuss me out, threaten me, even engage me physically. And I will admit that the knowledge of these things does change the way I feel about them, the way I look at them, the way I “inwardly,” to use Nurse Davis’s word, regard them.  But do I let it affect how I teach them?

I hope not.  But it’s possible, I’ll admit.  I am, as they say, only human.

I certainly grant occasional perks and privileges, special considerations if you will, to exceptional, hard-working, responsible and respectful students.  I certainly deny privileges to students who habitually behave in the opposite fashion.  And I believe, as do most sensible people, that the idea of giving awards and rewards to everyone in the room to avoid hurt feelings is just plain dumb, not to mention counter-productive.  But all this is in relation or in response to student actions and behaviors in my classroom and with either me or the other students.  That’s not the same as treating a student differently based on “prior bad acts,” to use the appropriate legal phrase (okay, I love the hour-long crime drama genre as well).

Teachers have a bad habit of both talking up and talking down students.  I’ve done it, as has every teacher I’ve ever worked with.  Sometimes, I can imagine that that “talking down” could lead, in a teacher of lesser integrity, to a situation where the students don’t get a fair shake from day one.  A hypothetical conversation:
Teacher A:  Hey, you got your class lists from next term?  Let me see.
Teacher B: Yeah, sure. Here.
Teacher A:  Oh, dang!  You’ve got Umfufu?  Oh man, I’m sorry. She’s a nightmare!  Last year, she ___________ [fill in the blank with whatever damning anecdotes you can dream up]. And when she was in intermediate school, she was suspended twice for ____________________ [ditto].
Now it’s up to Teacher B how, or whether, to utilize that information. 

If you’re Teacher B, does that conversation linger in your mind when Umfufu steps into your classroom? When you take attendance for the first time, does that knowledge affect the way you interpret the look she gives you?  The first time she’s late for class, or fails to bring in homework, do you say, “Yeah, it figures”?  (Even if it’s just behind closed doors to your colleagues?  If you’re a teacher, I’m guessing you know EXACTLY what I mean.)

Is the teacher situation analogous to the doctor/nurse situation? What do you think of the advice given Nurse Davis by her colleague?  Am I right that such a maxim simply would not work in a school environment where caring is de facto part of the job?  Or do you think teachers actually should be more detached, professional, and less “involved” than they sometimes are?

I’m just A.S.K.ing…


Sunday, April 14, 2013

In Hartford, Hereford and Hampshire, hurricanes hardly ever happen!

If one more news personality places the article “an” before the word “historic” or “historical,” I’m going to have an hissy fit.

See?  Sounds pretty stupid, doesn’t it?

Editor Judy Vorfield succinctly sums up the situation here, where she writes: “Using ‘an’ is common, but not universally accepted by experts.”  A gentle and noncommittal way of saying it ain't technically correct.  In other words, the usage falls into that ugly and contentious prescriptivist (it’s against the rule, so don’t do it) versus descriptivist (it’s the way people really talk, so it’s de facto legit) pissing match.

"An historic change in leadership is taking place..."  Oy, ‘eaven ‘elp me…

First of all, "a," not "an," is used before a consonant-initial word. And let me be clear for the non-linguists out there... A consonant is not a letter, it is a sound.  A vowel is not a letter, it is a sound.  So the word "honest" starts with a VOWEL, or, just to be clearer, a "vowel sound."  Now, if this were Cockney England, somewhere around Eliza Doolittle's old guttertown, then the letter [h] in "historic" would be silent, leaving the initial sound of "historic" instead as what we as kids learned to call the short-i vowel sound, making "an" the correct article, no different than if we were to say "an idiotic perspective," or "an imbecilic notion."

Or, if an American were trying to somehow channel Monty Python, having a bit of fun, perhaps, and pronounce the word /Is-'tor-Ik/, then again, the "an" would be correct.

However, "an" before a perfectly good American /h/-word is an artificial creation by pretentious wannabes (kind of like how Mariah Carey sometimes lapses into a faux British accent on American Idol, I guess she thinks it makes her sound deep or something?), rather like those types who use "whom" indiscriminately instead of "who" because they think it sounds more erudite.

Bollocks, I say.

But let’s make it a little more interesting.  Let’s allow for the possibility that “an historical moment” is perfectly acceptable for an American who is not currently trying to sell some flowers to Colonel Pickering to say.  Let’s compare: "A history class" vs. "An historical moment..."  Now, I think we all can agree (even the pretentious types who would actually say “an historical moment” with a straight face) that “an history class” is just, well, bad.

The key determining factor seems to be the stress of the [h]-initial syllable: "History" is stressed on the first syllable [his], therefore it is much less likely to lenite (weaken to near silence). In "historic(al)" the stress is on the second syllable [tor], not the first syllable [his], so the weaker (unstressed) syllable [his] is much more susceptible to lenition, becoming less, well, "consonantal." Vorfield states, "Before a word starting with a pronounced, breathy 'h,' use 'a.'" Which is exactly my point. The (currently, empirically) incorrect use of "an" at most signals, maybe, a gradual trending in American phonetics towards a softening of non-stressed word-initial [h].  But even that would imply an order of operations that is not extant: The pronunciation of "historical" has not changed over the last generation or two in American English; the only thing that has changed is the active choice to start using "an" beforehand instead of "a." 
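The distinction being drawn here boils down to a decision procedure on sounds, not letters. As a toy sketch in Python (the exception sets below are hypothetical, illustrative stand-ins, not a real pronunciation dictionary):

```python
# Toy illustration of the rule discussed above: choose "a" or "an" based on
# the word's first *sound*, not its first letter. The exception sets are
# minimal examples for illustration, not a complete pronunciation model.

VOWEL_SOUND_EXCEPTIONS = {"honest", "hour", "heir", "honor"}      # silent [h]
CONSONANT_SOUND_EXCEPTIONS = {"unicorn", "one", "user", "euro"}   # /j/ or /w/ onset

def article_for(word: str) -> str:
    """Return 'a' or 'an' for the given word, judging by initial sound."""
    w = word.lower()
    if w in VOWEL_SOUND_EXCEPTIONS:
        return "an"
    if w in CONSONANT_SOUND_EXCEPTIONS:
        return "a"
    # Default heuristic: a vowel letter usually signals a vowel sound.
    return "an" if w[0] in "aeiou" else "a"

print(article_for("historic"))  # "a" -- the [h] is pronounced in American English
print(article_for("honest"))    # "an" -- silent [h], so the initial sound is a vowel
```

Note that under this scheme "historic" and "history" come out identically, which is exactly the point: if you pronounce the [h], the article is "a," stress pattern notwithstanding.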

Some years ago, I voiced this particular pet peeve to a favorite professor of mine (who shall remain nameless since I’m sure he doesn’t want me dragging him into my petty dramas).  An English professor by trade and a historical linguist and classicist by avocation, he seemed the perfect one to hear my desperate plea.  He responded thus:
“To get your adrenaline flowing over usage questions bespeaks an addiction to adrenaline more than a devotion to the queen’s idiom, but we all have our peeves.  You’ve put the case pretty clearly—and correctly—in your statement, but don’t expect reformation.  Even the Merriam-Webster Dictionary of English Usage—a ‘must have’ for language mavens—concedes that in speech ‘an historical’ prevails, even in ‘American’.  In that case, and a couple of others, the editors cop-out by letting us do as we please based upon our particular pronunciation of the ‘h’ word in question.”
*sigh*

My take is simply that people who say it are trying to sound important, and are deliberately sacrificing correctness at the altar of pomposity.

Which is clearly an heretical thing to do… (oh, snap!)

Or am I just a crusty old prescriptivist?  You tell me, I’m just A.S.K.ing…

Saturday, April 6, 2013

The Prisoner - What Classic TV Can Teach Us About Tech

[Updated August, 2017]

I have a love-hate relationship with technology in education. 

I love to use technology in my classroom.  In fact, in some ways, I can’t do without it.  Not long ago, I arrived at an 8:00 class all ready to do a scintillating lesson that required use of the computer and LCD overhead projector in a so-called “smart classroom.” The tech didn’t work; there was a problem with the toggle that switched the feed from the Elmo to the computer, and I could not get the computer monitor’s contents displayed on the big screen. For a few minutes I tinkered with it to no avail.  Then I shut down and restarted everything. No dice. I even tried to go all “The Fonz” (you younger teachers fresh out of school will have no idea what I mean by that, and believe me, I weep for you…) and, aside from a few chuckles from my students, reaped no positive results. I glanced up at the clock and realized I had wasted a good 6-8 minutes, out of a 50-minute class, on this pursuit. I paused, uncertain for a few moments. More time wasted. For a short while, I felt quite stupid that my brilliant and carefully choreographed lesson was dashed against the rocks of fickle fate.

See? Technology makes me so crazy, even my metaphors are stupid.

Long story slightly less long, I called an audible, lateral-tossed the football to an alternate me (stupid metaphor #2), and got on with the class, and my extemporaneous lesson was fine. Why?  Because as much as I was hoping to be able to use the technology, I still was well-prepared and conversant in the subject matter, I knew where I was in the course sequence, I had good personal relationships and rapport with my students, and I built the course to be responsive to their needs, as opposed to dragging them kicking and screaming through the course at some pre-determined and inflexible pace. In short, when the tech failed, the human element was there to save the day.

I’m not so sure we’re headed in a good direction with tech. It’s bad enough that standardized tests have become practically the gold standard for educational assessment. It’s worse that rigid adherence to bullet-point lists of standards is all that is required to “prove” to an observer that the education is sound and of good quality. In fact, in many schools, as long as your lesson plans have the appropriate sections and list the links to the State Standards, as long as your instruction follows a particular sequence of steps and a designated format, and as long as you incorporate all the relevant trendy buzzwords from the district’s educational philosophy du jour – probably the product of the ministrations of some high-priced and charismatic outside consultant – you are a “good teacher.” It doesn’t really matter whether your students are benefiting, as long as your instruction at least superficially follows the prescribed norms; there simply isn’t time to provide more thorough analyses of teacher performance, so superficial indicators have to do.

But the superficiality does not stop there. An onslaught of trendy new technological platforms, gadgets and processes are threatening to take the human element even further away from the educational experience.  Now, when most people talk about integrating technology into the classroom, they’re talking about much more than simply projecting traditional content or using web-based communications to interact with students.  Now, students can do class "presentations" with no content, but they look good, and that's just as good as an essay, right? (Buzzword of the day: "alternative assessment.")

Now there’s talk of “a tablet for every student,” or “an iPad for every student.” Wot?

Have you ever seen how well students take care of textbooks? Notebooks? Their own papers? A school I worked at recently didn't even have procedures for being compensated for lost materials such as books, and used to lose some $20,000-$30,000 per year in unreturned supplies.  Just sayin'.

How bad has our love affair with tech gotten? Now, out of expediency (I say laziness, stupidity, and is there an adjectival form of "bandwagon"?), some institutions are even starting to computer-score essays. No, not multiple-choice tests... essays. For the record, I once submitted an essay to be machine-scored.  It got a perfect score (a 6, top score on a six-point rubric).  And it was a very well-written essay. (Duh.) Except for one thing – the essay was total nonsense.  Not only was it not even remotely related to the assigned prompt, but it was not internally consistent. If an Alzheimer’s patient wrote a paper while high on cocaine (yet somehow managing to maintain good grammar, syntax, punctuation, etc…), that would have been my essay.  I did it on purpose. I wanted to see what would happen.

Never put all your trust into something that cannot trust you back.  Except my ’04 Camry.  Love that thing. [Edit - As of August, 2017, it has 294,000 miles or so on it, and is really showing its age. It will not last to the new year, alas.]

I don't think tech is necessarily a bad thing, but I do get the distinct sense that tech is being forced into classrooms because of its "cool" factor and not because students (or faculty, for that matter) are developmentally ready and primed to receive the changes. I think that can and will have disastrous effects.  Too much tech without the human element and you basically have, well… The Matrix.

At just 17 episodes, the British TV series The Prisoner was short-lived, especially by modern standards.  The fact that it’s not a title that’s on the tip of everyone’s tongue might further lead one to think it irrelevant or – gasp! – a failure.  Make no such mistake.

Number Six, the show's ex-secret-agent protagonist, does have a lot to teach us.  My last post on this illustrious television show answered that age-old question: Can’t we all just get along?  (Correct answer: What are you selling?)  In this post, I direct my gentle reader to the episode entitled “The General.”  In it, Number Six learns that a fellow named “The Professor” has created a revolutionary educational process: by hooking students/subjects up to a sophisticated machine (called “The General”), they may be given the equivalent of a three-credit college course in a matter of minutes.  Soon, The Village (the setting of the show, basically a black site for interrogating and breaking rogue agents) is aflutter with newly erudite scholars of “Europe Since Napoleon,” the title of the first such “course.”  Number Six quickly realizes that everyone who tries to describe what they have learned in the course does so word for word, identically to everyone else he encounters.  Realizing that the machine completely eliminates the normally clearly demarcated line “between knowledge and insight,” Number Six correctly deduces that it is intended as a form of mind control, a way to subdue the denizens of The Village and extract information from them. The Villagers’ apparent trust in the process makes them even more susceptible, and Number Six reasons he must destroy The General (which he does, of course) in order to save both himself and his fellow Villagers.

View the full episode here

My very next post will be a research-based look at my general angst about our rush to over-technologize American public education.  It will be longer and denser, but more “scholarly.”  And it will read rather as a “Part II” to this post.  Read it!

For now, I leave you with this question: Are we going too far, too fast? Should we not pace ourselves a little bit more? Will upping the tech really change the culture, which is perhaps the real source and reason for school failure? And what exactly is meant by “change the culture,” anyway?  And why am I asking so many questions, when I only said I was going to ask one?