Tuesday, April 30, 2013

Cinco de Mayo – A uniquely American holiday, courtesy of… México!

Outside of areas with large Mexican populations, Cinco de Mayo as a holiday is largely ignored.  Allow me to suggest that this perhaps is a mistake.  In schools, certainly, the history of this critically important day provides a valuable historical lesson, not just about our neighbor to the south, but about the intangibles that once defined us as a nation.

When asked what Cinco de Mayo is, most people – even, I learned after many years of teaching in California schools, many people of Mexican extraction – automatically report, erroneously, that it is Mexican Independence Day.  (That would be September 16, sorry.)  It is always sad when a people is disconnected from its heritage to such an extent that no one can recall the formative moments of that culture’s psyche; can you imagine an American calling him- or herself a “patriot,” but not knowing why there are 13 stripes on the flag, or why we recognize certain presidents’ birthdays, or what “the shot heard ‘round the world” was? 

In Mexico, Cinco de Mayo is acknowledged, but not celebrated as a national day with any real fervor on a national scale; official recognition of the day varies regionally, with individual states (yes, Mexico has states) deciding how the day is to be commemorated, if at all.  The occasion is celebrated with much more vehemence, pomp, circumstance and ornamentation in parts of the United States – those with a significant Mexican population, understandably – than in Mexico itself.  This makes intuitive sense, perhaps; people separated from their cultural center, if they have any desire to commemorate or celebrate their heritage and cultural identity, or any drive to maintain the same, will strive wherever possible to forge connections through ritual.  Islands of Mexicanity within the borders of the United States can therefore engage in these celebrations and feel a closer connection to their homeland, their distance, like absence, having ostensibly made the heart grow fonder. 

However, many American celebrations of Cinco de Mayo have degenerated into generic and heavily commercialized cultural pageants; Cinco de Mayo is, in many American communities in the West and Southwest, merely an undifferentiated outpouring of nostalgia for a missed fatherland, a contrived concatenation of music, color, dance and food, with little – if any – acknowledgment (or even understanding) of the historical occasion which the day is meant to commemorate.  I recall a Mexican-American administrator at a school where I spent an early part of my full-time teaching career who, attempting an elegant encomium to Mexico over the loudspeaker one Cinco de Mayo, informed the student body, perhaps a quarter of whom were Chicanos, that “today [was] the day we celebrate the Mexican declaration of independence.”  ¡Carajo, hombre!

That is a shame, because despite the occasional flare-ups of grotesque and arbitrary anti-Mexican sentiment extant in many sectors of the current American political landscape, Cinco de Mayo is, above all things, very much a holiday in the American spirit, and a significant moment in a critical historical period in our hemisphere – one worth getting to know.

The period between the 1760s and the 1860s was a momentous era in Western history. Thanks in no small part to Schoolhouse Rock, everybody knows the phrase “taxation without representation,” and has a vague idea that the American colonists were spurred on by outrage to fight for independence.  Anybody who cares to dig a little deeper quickly gets past the easy misconception that the outrage the colonists felt was simply petty, petulant and trite (à la Occupy Wall Street – don’t get me started on that; it’s off-topic, just move along…), like a child protesting his “unfair” bedtime and throwing a fit; in fact, the Revolution and subsequent Declaration of Independence (I’ll pause for a moment while you go look it up – yes, the war started before the Declaration was penned) were the inevitable end product of a principled line of thinking that had its seeds in the Enlightenment.  But the story does not end with the surrender of Cornwallis.

Proper historians may cringe at the truncated treatment I am about to give a rather complex series of political upheavals, but this plays better from 30,000 feet, so broad strokes only for a while.
 
The French Revolution, which began in 1789, was directly influenced by the American Revolution, and the French National Constituent Assembly’s Declaration of the Rights of Man and of the Citizen paid generous homage to both the Declaration of Independence and the Constitution.  One wonders if the last thing that went through Louis XVI’s head in 1793, besides a guillotine blade, was the idea that maybe signing the Treaty of Alliance with the American colonies in 1778 wasn’t such a hot idea after all.  The French Revolution stimulated the people of Haiti, then a French colony, to revolt, and for thirteen years, war raged in that tiny country.
  
Then, a few short years later, nearby Mexico caught the revolution bug but good, as Napoleonic-era France kept Spain distracted long enough to loosen its grip on its colonies, giving Mexico an opening to follow Haiti’s lead, which it did.  This chain reaction of wars of independence cascaded down the spine of Mesoamerica into the South American continent, and by mid-century almost all of the modern-day Spanish-speaking nations of the hemisphere had declared, fought for, and won their independence from Spain. 

Mexico, as a newly independent nation, had what one might understatedly call growing pains.  An almost constant state of civil war existed over the three-and-a-half decades between 1821 and 1857, with a shocking series of revolving-door presidencies (Santa Anna alone was named President some ten or eleven separate times) while the nation saw not one but three separate, distinct national constitutions implemented.  The anti-theocracy capitalist reformer Benito Juárez played an important role in the creation of Mexico’s 1857 Constitution, a document which set forth guarantees that included freedom of speech, freedom of assembly, a reaffirmation of the ban on slavery [Note/edit: Thanks to the poster at /r/mexico on reddit who alerted me to the fact that Mexico had actually banned slavery in 1829], the right to bear arms, and the familiar notion of “the consent of the governed,” captured in the Constitution’s Article 39:
Article 39, The Mexican Constitution of 1857:

La soberanía nacional reside esencial y originariamente en el pueblo. … El pueblo tiene en todo tiempo el inalienable derecho de alterar o modificar la forma de su gobierno. 
“[Our] national sovereignty resides in and originates with the People. … The People at all times have the inalienable right to alter or modify the form of their government.”   
Compare to the American Declaration of Independence:
“Governments are instituted among Men, deriving their just powers from the consent of the governed, — That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government…”
Juárez became Mexico’s next president in 1858.  He borrowed funds heavily from European powers – England, France, Spain (yes, Spain) – to rebuild his war-torn country, but after a short few years, his creditors came calling, led by an extraordinarily zealous Napoleon III of France, who, in response to Juárez’s non-repayment, invaded Mexico in 1861, driving Juárez’s government out of the capital, and succeeded in installing Hapsburg stooge and eventual fall guy Maximilian I as the puppet Emperor of Mexico in 1864.

The Mexican people did not go gentle into that good night, however.  While the elite French forces generally made short work of whatever Mexican resistance stood in their way, there was one battle, at the city of Puebla on May 5, 1862, that, while not a key or decisive battle in the larger conflict, became the best known (at least by name) battle on Mexican soil – the Battle of the Alamo was technically fought in the Republic of Texas – in that nation’s history. Reports vary on the precise troop strength of the opposing armies: on the Mexican side, estimates run as low as 1,500 and as high as 4,500; on the French side, estimates are fairly consistent at around 6,000, though I have read some as high as 8,000.  What is certain is that the Mexican fighting force was perhaps one step up from rabble, while the French forces were arguably among the most formidable, well-trained, and well-outfitted in the world.  David, however, defeated Goliath, sending the French temporarily into retreat, the number of their casualties three to four times that of the stalwart Mexican defenders.

True, the French did come back, this time with an army several times larger, and they completed their national conquest by 1864. But it did not take.  By 1867, Napoleon III had withdrawn both his interest in the Mexican adventure and his troops, leaving Maximilian without a lifeline.  Maximilian was overthrown and captured in 1867, and the restored government of Benito Juárez sentenced him to death by firing squad. 

In commemoration of the Battle of Puebla, which despite its ultimate futility showed the rest of the world a nation of unexpected guts and grit, and provided an emotional flashpoint – like the Alamo, the Spartans at Thermopylae, or the defeat of the Gunpowder Treason – to kindle a resurgent sense of national pride, Juárez declared May 5th a day of national celebration.  It never, however, really became an official federal holiday in Mexico:
“Battle of Puebla is not a general public holiday, as per the "artículo 74 de la Ley Federal del Trabajo", but is included as a public sector holiday (Las dependencias y entidades de la Administración Pública Federal, cuyas relaciones de trabajo se rijan por el Apartado B del artículo 123 Constitucional) in the separate "Decreto por el que se reforma el Artículo Segundo del Decreto por el que se establece el Calendario Oficial"
However, the symbolism of the strength shown by an isolated, outnumbered band of ragtag underdogs surrounded by hostility became a go-to font of patriotic pride for displaced Mexicans within U.S. borders.  Non-Mexican Americans would do well to acknowledge this day as of equal significance to America’s own Declaration of Independence, for while the Mexico of the 20th and 21st centuries does not appear to have lived up to the glorious promise of its inspiring foundational period (for that matter, the U.S. seems to have betrayed its legacy in a number of ways as well, but that too is a topic for another day, another time, another blog), the day itself, and the battle it commemorates, are sterling examples of the spirit that once made America great. This is something that students – all Americans – need to understand.  Let Cinco de Mayo be celebrated in our classrooms nationwide.

Pride is not a sin.

Is it appropriate to recognize a non-U.S. holiday in the U.S.?  Does it set us down a slippery slope?  I dunno, I’m just A.S.K.ing… 

                      [Note: Read the follow-up to this post here.]

Thursday, April 25, 2013

Oh, Those Golden Handcuffs

When I left my last school district, making well over $70,000 (those of you in California reading this will say “so what,” but in upstate New York, a house that would cost $500,000-$600,000 in the suburban San Francisco Bay Area where I used to live and teach only costs $125,000-$225,000), I found it nigh impossible to re-establish myself with another school district.

My resume looked pretty good – 18 years teaching secondary school and college, three teaching credentials (English, Spanish, ESL), experience being  a department chair (three times), a program manager (twice), and a curriculum designer (twice), oodles of experience with the school accreditation process (four cycles), a fair amount of committee work, and some educational technology experience to boot.  My references were excellent – a small stack of emphatically supportive letters from various people at different levels stretching back 24 months or so.

Then an interviewer in 2011 told me, mid-interview, that there was a problem with my experience.  They simply would not agree to pay anyone in accordance with their experience, citing the ample supply of eager teachers who would work for much, much less.  In my case, that meant that if they were going to offer me a job (which they did not, because I ended the interview) it would be at a salary of perhaps $43,000 instead of $73,000.

Vivek Wadhwa, a blogger, “top linkedin.com influencer,” and Fellow at the Arthur & Toni Rembe Rock Center for Corporate Governance at Stanford University, wrote an inspiring article just today (4/25/13) on how industry devalues experience and expertise at its own peril.  An excerpt:
True, some older workers become complacent as they age. They become set in their ways, stop learning new technologies, and believe they are entitled to the high wages they earned at their peak. Their careers stagnate for a valid reason. But these are the exception rather than the rule. Whether it is in computer programming or entrepreneurship, older workers have many advantages—they still are the guru’s [sic].
This entry immediately followed, and was a direct sequel to, a post Wadhwa had written on 4/22/13, on the rampant ageism in the tech industry.  An excerpt:
It may be wrong, but look at this from the point of view of the employer. Why would any company pay a computer programmer with out-of-date skills a salary of say $150,000, when it can hire a fresh graduate — who has no skills — for around $60,000? Even if it spends a month training the younger worker, the company is still far ahead. The young understand new technologies better than the old do, and are like a clean slate: They will rapidly learn the latest coding methods and techniques, and they don’t carry any “technology baggage.” The older worker likely has a family and needs to leave the office by 6 p.m. The young can easily pull all-nighters.

What the tech industry often forgets is that with age comes wisdom. Older workers are usually better at following direction, mentoring, and leading. They tend to be more pragmatic and loyal...
The two essays, both absolute must-reads, are two sides of the coin of the realm in the modern world of work – expediency over expertise; malleability over maturity; deference over dynamism.  Employers want people cheap, who will do the job, loyally and faithfully even to a fault, and not get in the way of management’s mission.  And the fact is (and we can admit this as an empirical truth) that older folks are less likely to have the kind of flexibility demanded by that type of management style.

The reductio ad absurdum at the end of this slippery slope is a scene right out of Ayn Rand’s Anthem:
You shall do that which the Council of Vocations shall prescribe for you. For the Council of Vocations knows in its great wisdom where you are needed by your brother men, better than you can know it in your unworthy little minds. And if you are not needed by your brother men, there is no reason for you to burden the earth with your bodies…
...Thus must all men live until they are forty. At forty, they are worn out. At forty, they are sent to the Home of the Useless, where the Old Ones live. The Old Ones do not work, for the State takes care of them. They sit in the sun in summer and they sit by the fire in winter. They do not speak often, for they are weary. The Old Ones know that they are soon to die. When a miracle happens and some live to be forty-five, they are the Ancient Ones, and children stare at them when passing by the Home of the Useless.  (ch. 1)
Hyperbole? Perhaps.  But bit by bit, older teachers are being forced out in a war of attrition, for financial reasons, philosophical reasons, or a combination of the two – refer to the saga of Gerald Conti for a refresher course on how principles (as well as principals!) are driving many teachers to abandon the career to which they have lovingly dedicated their lives – and once forced out, it is increasingly difficult for them to find purchase in another teaching position.  They’re just too expensive, and expertise has so little value anymore.

Not only do older teachers cost more (most schools use a salary schedule), but instead of being respected, the changing climate of American public education puts modern progressive mores in direct conflict with the values of most teachers 45 and older, which has the net effect of stigmatizing older teachers and their experience. I have had numerous hiring committees and district administrators tell me that it's simply not worth it to hire an experienced teacher unless s/he can be paid a starting-teacher price. Younger teachers are more malleable, more eager to please, and more willing to drink the Kool-Aid. There also seems to be a pervasive feeling that older, more experienced teachers, when they try to share their experience with younger faculty – especially when and if their views diverge, as is more and more the case these days – are somehow being hegemonic in asserting their wisdom, as if that wisdom invalidates the energy and youthful missionary zeal of younger teachers. It is incredibly divisive, and there seems to be an us-vs.-them mentality, at least in the public school systems I've looked at lately, especially in the current educational climate.

I used to mock professional athletes when they would “hold out,” demanding to be paid “what they’re worth.”  Now I kind of get it.

But here’s a final thought, a sort of counterpoint:  I’m teaching college now, as an adjunct on two campuses.  I’m waiting to hear if I made it through to the third round of interviews for a tenure-track position at one of them.  If I get the job, I will be ecstatic.  It does not bother me that the salary will likely be $20,000-$25,000 less than I would get as a high school teacher.  This is not all about money… there are more ways than just money to value an employee, and to demonstrate that value through a supportive and collegial environment and a teacher-friendly atmosphere.  More and more, public schools have neither.  (My college is, thankfully, an embarrassment of riches in this regard.)  At this point, I don’t know if an offer of $75,000 would be sufficient to bring me back, even though my family desperately needs the benefits, the health care, and the stability.
 
Gerald Conti is leaving the career that he loves after 28 years; I’m guessing his salary is far more than mine was when I left my last school district on "Salary Step 19."  He could hang on for two more years and become more fully vested in his retirement plan, netting him a significant bump in future disbursements.  He is choosing not to.  He is lucky to have the financial stability to be able to make that principled decision.  A lot of veteran teachers do not – they are chained to their job by the very real threat of not being able to find another one, at least not one that pays anything close to what they’re worth, locked in place by the Golden Handcuffs.

What is the key to unlock the Golden Handcuffs?  I dunno.  I’m just A.S.K.ing…

Thursday, April 18, 2013

One, linguist's, thoughts, on, the, Second, Amendment,

My undergraduate microeconomics professor used to express market dynamics in terms of “guns and butter.”  (Butter doesn’t kill people, bad diets kill people.)  Most conversations about guns outside the classroom, however, focus on meatier real-world issues.  Gun owners are afraid that someone is going to come and take their guns away.  And maybe they should be, since, properly analyzed, there is NO Constitutional protection of an individual’s “right” to own a gun. This does not mean that gun ownership is or should be illegal.  Just don’t look to the Constitution for help.  This linguist (that's me) explains why.

Barbara Newman writes:
The Latinate framers of the US constitution employed an ablative absolute in the Second Amendment: ‘A well-regulated Militia being necessary to the security of a free State, the right of the people to keep and bear Arms shall not be infringed.’ An interpreter who favoured regimen would argue that the ablative clause determines the sense of the main clause; hence, the state has the right to maintain an army. Those who favour the absolute, as American courts have done, bracket the militia clause and take the main clause to mean that citizens may own as many firearms as they choose. The difference between constructions amounts to roughly 12,000 murders a year
Setting aside Ms. Newman’s editorial digression, what is an ablative absolute?  An English absolute is a construction formed by a noun or pronoun and a participle; it is an English reinvention of a classical bit of stylized Latin rhetoric, the “ablative absolute,” many examples of which can be found, with simple explanations, here.

What seems to be clear is that while an ablative absolute construction does not have any grammatical or syntactic force over the main clause of the sentence, it has strong causal or semantic force.  Read retired English teacher Mark Moe’s recent article on this idea here.

This logic even made its way to the Supreme Court. Here is a passage from the introductory section of the amici curiae (“friends of the court”) brief submitted to the United States Supreme Court by a team of linguistics professors specializing in the history of English grammar and the linguistics of English in legal contexts (note the difference in comma usage compared to the Amendment as rendered by Newman – more on that later):
The Second Amendment reads: “A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.” Under longstanding linguistic principles that were well understood and recognized at the time the Second Amendment was adopted, the “well regulated Militia” clause necessarily adds meaning to the “keep and bear Arms” clause by furnishing the reason for the latter’s existence. The first clause is what linguists call an “absolute construction” or “absolute clause.” It functions by melding the sentence “A well regulated Militia is necessary to the security of a free State” together with the sentence “The right of the people to keep and bear Arms shall not be infringed” to express this thought: “Because a well regulated Militia is necessary to the security of a free State, the right of the people to keep and bear Arms shall not be infringed.” On its face, the language of the Amendment tells us that the reason why the right of the people to keep and bear arms shall not be infringed is because a well regulated militia is necessary to the security of a free State. The purpose of the Second Amendment, therefore, is to perpetuate “a well regulated Militia.”
Dr. Dennis Baron, a historian of the English language and professor at the University of Illinois, and one of the authors of the amicus brief, writes in his blog, “Absolutes are grammatically independent, no doubt about it. But grammatical independence has always been narrowly defined, and it never excludes the clear semantic connection between an absolute and the rest of the sentence.”

So now let’s look at the Second Amendment again:
5c.  “A well regulated militia, being necessary to the security of a free state, the right of the people to keep and bear arms, shall not be infringed.”
Sentence 5c illustrates the absolute construction: “a well regulated militia” is the noun phrase, and “being” is the participle.  But what about that dratted comma?  Adam Freedman offers a sensible observation:
[T]here could scarcely be a worse place to search for the framers’ original intent than their use of commas. In the 18th century, punctuation marks were as common as medicinal leeches and just about as scientific. Commas and other marks evolved from a variety of symbols meant to denote pauses in speaking. For centuries, punctuation was as chaotic as individual speech patterns.  
The situation was even worse in the law, where a long English tradition held that punctuation marks were not actually part of statutes (and, therefore, courts could not consider punctuation when interpreting them). Not surprisingly, lawmakers took a devil-may-care approach to punctuation. Often, the whole business of punctuation was left to the discretion of scriveners, who liked to show their chops by inserting as many varied marks as possible.  
Another problem with trying to find meaning in the Second Amendment’s commas is that nobody is certain how many commas it is supposed to have. The version that ended up in the National Archives has three, but that may be a fluke. Legal historians note that some states ratified a two-comma version. At least one recent law journal article refers to a four-comma version.
James Madison was classically educated, and it is more than fair to say that he would have known the classical ablative absolute style.  If the sentence is read in that fashion, and the last comma – for which there seems to be no earthly grammatical reason, in Madison’s century or ours – is eliminated, the result is this much clearer Second Amendment redux (also alluded to in the SCOTUS amicus brief):
5d.  “A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed.”
The gun lobby’s collective urge to say “A-ha!” at the now unified “the right of the people to keep and bear arms shall not be infringed” must be stifled (or perhaps, “infringed?”) by the semantic force of the absolute phrase, which clearly subordinates the truth-value of the main sentence to the conditionality established by the prefatory content, similar to:
    “Inasmuch as…”
    “Because…”
    “Since…”                            >       “…a well regulated militia is necessary…”
    “Due to the fact that…”
    “As…”
    “As long as…”
In other words, the right to possess a gun is predicated on the existence of a State-backed civilian militia; in the absence thereof, the sentence’s main clause becomes vacuous, moot.

One does not even need to refer back to history – to the fact that American law is rooted in British Common Law (an article in the Huffington Post looks into this a little bit with regard to the Second Amendment specifically) – to recognize that a purely linguistic analysis of the Second Amendment fails utterly to establish a Constitutional right for Americans to possess guns.

BUT WHAT ABOUT SO-CALLED "NATURAL" RIGHTS?

The authors of an anonymous blog (yeah, I know, bad scholar, BAD scholar...) reach an interesting conclusion, offered here as food for thought:
…owning guns was an everyday reality for virtually everyone at the time the amendment was written. The idea that a hunting rifle could be banned would have been as outrageous as banning cars today. It was some people's means of food and money -- and I don't mean shootists, I mean hunting.[ … ]

As Patrick Henry said, "Guard with jealous attention the public liberty. Suspect every one who approaches that jewel. Unfortunately, nothing will preserve it but downright force. Whenever you give up that force, you are inevitably ruined...." This is the attitude of the Second Amendment. The People have the right to maintain the ability to keep the Federal government in check against tyranny. The right to own weapons privately is assumed in this, as is the right to self-defense (another item already a part of British common law before the Constitution was drafted). Early commentary on the amendment confirms this view. The Boston Journal of the Times printed in 1769, commenting on the British Bill of Rights and the King's attempt to disarm the colonists, that "It is a natural right which the people have reserved to themselves, confirmed by the Bill of Rights, to keep arms for their own defence." The right to own a gun was viewed as a NATURAL RIGHT. One of the self-evident, unalienable rights that the people had, and commissioned government to protect .
There are two issues here. The first is survival.  The idea of guns for survival is vestigial, like a useless organ or withered limb, and has little or no force in the calculus of determining so-called “gun rights” in our time.  The second is defense, and specifically, defense against one’s own country.  The Declaration of Independence does call for rights-respecting and freedom-loving people to rise up and revolt when it proves necessary:
“But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.” 
Surely it is not possible to do so against a government that has a standing army without being armed oneself?

These issues, while worth debating, are outside the question at hand, however.  Neither finds purchase in the wording of the Second Amendment, under any of the interpretations, phrasings or punctuation arrangements listed in this treatment, anyway.  And to take the conditions of the then and carelessly apply them to the now is to be guilty of a version of the fallacy of false etymology or false definition.  (Though it might well tickle one who was opposed to the proliferation of guns to suggest to overzealous gun advocates that, okay, sure, the Second Amendment means exactly what you think it means, but the definition of “arms” is frozen to include only the technologies available at the time of the document’s writing.)

If there is a natural right to gun possession, then it must be an argument based on principles, not on the wording or the grammar of the Second Amendment, or whether the Second Amendment is properly rendered with one, two, three or four commas; it is an argument for philosophers.  If there is a blanket right to own a gun, it exists outside the Constitution, definitively.  [Comment:  With regard to principle, I will say that I believe that implicit in the rights to life, liberty and property is the right to defend that life, liberty and property.  But that's not the issue of this piece; that's an issue of philosophical principle, not linguistics.]

About those philosophical principles, I have said, and will say, nothing (else).  I’m not calling for the abolition of all personal firearms (as some will surely accuse). Nor, in fact, am I even calling for the abolition of ANY firearms.  Simply put, as a linguist I find that the “right to bear Arms,” as the phrase is used (wielded?) nowadays, is NOT extant in the Constitution.  

Now, put that in your musket and fire it.  (Now, put that in your musket, and fire it?) 

Dratted commas

Sunday, April 14, 2013

In Hertford, Hereford and Hampshire, hurricanes hardly ever happen!

If one more news personality places the article “an” before the word “historic” or “historical,” I’m going to have an hissy fit.

See?  Sounds pretty stupid, doesn’t it?

Editor Judy Vorfield succinctly sums up the situation here, where she writes: “Using ‘an’ is common, but not universally accepted by experts.”  A gentle and noncommittal way of saying it ain't technically correct.  In other words, the usage falls into that ugly and contentious prescriptivist (it’s against the rule, so don’t do it) versus descriptivist (it’s the way people really talk, so it’s de facto legit) pissing match.

"An historic change in leadership is taking place..."  Oy, ‘eaven ‘elp me…

First of all, "a," not "an," is used before a consonant-initial word. And let me be clear for the non-linguists out there... A consonant is not a letter, it is a sound.  A vowel is not a letter, it is a sound.  So the word "honest" starts with a VOWEL, or, just to be clearer, a "vowel sound."  Now, if this were Cockney England, somewhere around Eliza Doolittle's old guttertown, then the letter [h] in "historic" would be silent, leaving the initial sound of "historic" instead as what we as kids learned to call the short-i vowel sound, making "an" the correct article, no different than if we were to say "an idiotic perspective," or "an imbecilic notion."

Or, if an American were trying to somehow channel Monty Python, having a bit of fun, perhaps, and pronounce the word /Is-'tor-Ik/, then again, the "an" would be correct.

However, "an" before a perfectly good American /h/-word is an artificial creation by pretentious wannabes (kind of like how Mariah Carey sometimes lapses into a faux British accent on American Idol, I guess she thinks it makes her sound deep or something?), rather like those types who use "whom" indiscriminately instead of "who" because they think it sounds more erudite.

Bollocks, I say.

But let’s make it a little more interesting.  Let’s allow for the possibility that “an historical moment” is perfectly acceptable to say by an American who is not currently trying to sell some flowers to Colonel Pickering.  Let’s compare: "A history class" vs. "An historical moment..."  Now, I think we all can agree (even the pretentious types who would actually say “an historical moment” with a straight face) that “an history class” is just, well, bad.

The key determining factor seems to be the stress of the [h]-initial syllable: "History" is stressed on the first syllable [his], so its [h] is much less likely to lenite (weaken to near silence). In "historic(al)" the stress is on the second syllable [tor], not the first syllable [his], so the weaker (unstressed) syllable [his] is much more susceptible to lenition, becoming less, well, "consonantal." Vorfield states, "Before a word starting with a pronounced, breathy 'h,' use 'a.'" Which is exactly my point. The (currently, empirically) incorrect use of "an" at most signals, maybe, a gradual trend in American phonetics toward a softening of unstressed word-initial [h].  But even that would imply an order of operations that is not in evidence: the pronunciation of "historical" has not changed over the last generation or two in American English; the only thing that has changed is the active choice to start using "an" beforehand instead of "a." 
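Purely for fun, here is what that rule looks like if you try to teach it to a computer.  This is a toy sketch in Python – my own, not anything from Vorfield or the usage guides – and the little set of silent-h words is hand-picked and illustrative, standing in for a real pronunciation dictionary:

# Toy sketch: choose "a" or "an" by the first SOUND of the next word,
# not by its first letter.  The silent-h set below is illustrative only.
SILENT_H = {"honest", "honor", "honour", "hour", "heir", "heirloom"}
VOWEL_LETTERS = set("aeiou")

def article_for(word):
    """Return 'a' or 'an' based on the word's initial sound."""
    w = word.lower()
    if w in SILENT_H:          # silent h: the word begins with a vowel sound
        return "an"
    if w[0] == "h":            # pronounced, breathy h: consonant sound
        return "a"
    if w[0] in VOWEL_LETTERS:  # (ignores exceptions like "university" or "one")
        return "an"
    return "a"

for word in ["historic", "history", "honest", "hissy", "idiotic"]:
    print(article_for(word), word)
# a historic, a history, an honest, a hissy, an idiotic

No lenition logic, no Cockney mode, but the point stands: the decision is phonetic, not orthographic.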

Some years ago, I voiced this particular pet peeve to a favorite professor of mine (who shall remain nameless since I’m sure he doesn’t want me dragging him into my petty dramas).  An English professor by trade and a historical linguist and classicist by avocation, he seemed the perfect person to hear my desperate plea.  He responded thus:
“To get your adrenaline flowing over usage questions bespeaks an addiction to adrenaline more than a devotion to the queen’s idiom, but we all have our peeves.  You’ve put the case pretty clearly—and correctly—in your statement, but don’t expect reformation.  Even the Merriam-Webster Dictionary of English Usage—a ‘must have’ for language mavens—concedes that in speech ‘an historical’ prevails, even in ‘American’.  In that case, and a couple of others, the editors cop-out by letting us do as we please based upon our particular pronunciation of the ‘h’ word in question.”
*sigh*

My take is simply that people who say it are trying to sound important, and are deliberately sacrificing correctness at the altar of pomposity.

Which is clearly an heretical thing to do… (oh, snap!)

Or am I just a crusty old prescriptivist?  You tell me, I’m just A.S.K.ing…

Saturday, April 13, 2013

Shakespeare, Ibsen, The Last Unicorn, and a Big Red Rag...

[Updated August, 2017]

I used to wonder, when I was in Mr. Fischer’s English 12 class, studying Shakespeare… When we do literary analysis on the works of long-deceased authors, how do we know what these authors really meant?  Maybe all of the pretension that we heap posthumously on their work is just that – pretense – representing nothing more than a generations-long collective best-guessing effort that, through transmission, becomes fossilized into quote-unquote scholarly analysis.  Maybe Shakespeare was just writing some cool, edgy stuff to sell tickets?

My mind goes back to my early undergraduate years at Cornell.  The year was 1988, maybe 1989.  I was up late one night, studying, in an unoccupied room in stately Goldwin Smith Hall.  As fatigue, boredom and frustration closed in on me from all sides, I took up a piece of chalk and set to declaring my frustration, in the way that only the 17- or 18-year-old I then was could, in the form of a colorful metaphor, possibly involving a bodily appendage not traditionally used for or while studying.  It was a silly, random eruption of angst.  At some point, I left the building.
 
Sometime later that week or month, I happened across a copy of The Big Red Rag, at the time a feminist newspaper on campus.  (I think the title has remained, but it’s now an arts and entertainment publication.  Someone correct me if I am wrong?)  As I was flipping through its pages, I came across a graphic, in the middle of which was prominently displayed the very sentence I had written, and one of the Rag’s staff writers – actually a girl who had been in one of my Freshman Writing Seminars the previous year, I recognized the name – had performed an impressive deconstruction/analysis, word-by-word, of how the sentence spoke to my massive insecurities (and my attempts to compensate for them), my mommy issues, my desire to rule the world, and how I was undoubtedly a physical incarnation of the malevolent wave of misogyny that held sway in the world as she perceived it.  It was an impressive display of skillful and erudite analysis being guided by (since the original author, moi, was unavailable for comment) the desires of the analyst to conclude what she wished to conclude.  (Anybody remember Charles Manson’s “selective” interpretation of The Beatles’ “Blackbird?”) It could have been satire, I suppose.  It’s hard to tell.  If it was satire, it was artfully done.  Bravissima. If it was serious, well…

Which brings me back to Shakespeare.  How do we know that the interpretations of events we teach/learn are definitive?  How do we know what the artists intended?  How do we know what was going on in the mind of The Bard?  Or John Donne?  Or Edgar Allan Poe?  Or James Madison?  (What was the intent of the Framers with regard to the Second Amendment? That debate has been making the rounds lately, and I’m sure to tick off more than a few gun nuts and Libertarians when I publish my grammatical interpretation of what the Second Amendment really means…) I even read an analysis of Robert Frost’s “Birches” once that claimed that the up-and-down movement of Frost's birch trees is a metaphor for nothing more profound than sex, or perhaps onanism. (You can look the word up yourself; this is a G-rated article.) The link to that original article is now dead, but it is referenced here.

I have previously written about my feelings on the “inflation” of the importance of the verse of Tupac Shakur, and don’t even get me started on the pretense heaped on certain celebrated practitioners in the art world. But I’ve always considered myself to be cynical and inquisitive enough of a critical thinker not to be sucked in by the alluring complacency of the surety that I know what’s what.  Still, when I read Henrik Ibsen’s Peer Gynt (in translation), and found a quote I recognized from a much later work, I thought I had made the discovery of the century.  In the play, Peer, the title character, embarks on a surreal set of adventures – rather like Huck Finn crossed with Odysseus – during one of which he meets with a “Voice from the Darkness,” the great Bøyg:
PEER   [tries to force a passage at another place, but strikes against something]. Who are you?
THE VOICE   Myself. Can you say the same?
PEER   I can say what I will; and my sword can smite! Mind yourself! Hu, hei, now the blow falls crushing! King Saul slew hundreds; Peer Gynt slew thousands! [Cutting and slashing.] Who are you?
THE VOICE   Myself.
PEER   That stupid reply you may spare; it doesn't clear up the matter. What are you?
THE VOICE   The great Bøyg.
PEER   Ah, indeed! The riddle was black; now I'd call it grey. Clear the way then, Bøyg!
THE VOICE   Go roundabout, Peer!
PEER   No, through! [Cuts and slashes.] There he fell! [Tries to advance, but strikes against something.] Ho, ho, are there more here?
THE VOICE   The Bøyg, Peer Gynt! the one only one. It's the Bøyg that's unwounded, and the Bøyg that was hurt, it's the Bøyg that is dead, and the Bøyg that's alive.
PEER   [throws away the branch]. The weapon is troll-smeared; but I have my fists! [Fights his way forward.]
THE VOICE   Ay, trust to your fists, lad, trust to your body. Hee-hee, Peer Gynt, so you'll reach the summit.
PEER   [falling back again]. Forward or back, and it's just as far;- out or in, and it's just as straight! He is there! And there! And he's round the bend! No sooner I'm out than I'm back in the ring.- Name who you are! Let me see you! What are you?
THE VOICE   The Bøyg.
PEER   [groping around]. Not dead, not living; all slimy; misty. Not so much as a shape! It's as bad as to battle in a cluster of snarling, half-wakened bears! [Screams.] Strike back at me, can't you?
THE VOICE   The Bøyg isn't mad.
PEER   Strike!
THE VOICE   The Bøyg strikes not.
PEER   Fight! You shall
THE VOICE   The great Bøyg conquers, but does not fight.
It’s that last line that struck me. One of my favorite novels is Peter S. Beagle’s The Last Unicorn. (TLU was made into a Rankin/Bass animated feature film in 1982, screenplay by the author, starring the voices of Alan Arkin, Mia Farrow, Angela Lansbury, Rene Auberjonois, and mega-geek-cred actor Christopher Lee. Beagle earns extra geek cred for having written the Trek: TNG episode “Sarek.”) In the novel, the pathos-ridden, ageless and timeless Schmendrick the Magician befriends the last unicorn in the world.  Together, they travel far and wide, and eventually run up against the Red Bull, the creature responsible (sort of) for the disappearance of all the other unicorns.  Of the Bull, Schmendrick says, “The Red Bull never fights....He conquers, but he never fights.”

My brain exploded.

The Bøyg is a mysterious creature who exists outside what might be considered space-time.  He is part Tom Bombadil, part Yog-Sothoth.  I’ll let that sink in.

The Red Bull, too, “appears” from seemingly nowhere; it is unclear whether he has any real physical form, or if the Bull assumes physical form only to interact with the characters.  The caverns beneath the castle of wicked and broken King Haggard, who has more than one major secret, are said to be where the Bull’s lair is, but it is unclear who is master and who is servant.  The Bull may even be Haggard himself, somehow. 

They both conquer, but do not fight.  I became instantly certain that Beagle’s usage was a deliberate homage to The Bøyg.  I was absolutely sure of it.  How could it not be so?

Then I looked him up. Beagle, not the Bøyg or the Bull. (Living authors are a great treasure!)  And I asked him directly, via electronic message:
Hi. I'll try not to geek out too much. Huge TLU fan, and a high school English teacher in NY who is using TLU in class. I noticed that the words "he conquers but does not fight," used to describe the Red Bull, are also the words used by Ibsen (in translation) to describe the beast (The) Bøyg in Peer Gynt. I can find no scholarly mention of this curious connection, not even on fan-sites and other delicious outposts of good-natured geekery. Is the Red Bull an homage to the Bøyg? (And if so, I'm going to really have to give the Ibsen a closer read...) Thanks!
And he responded (in part):
I hate to admit this, because it reflects badly on my magpie education, but while I know a number of Ibsen's plays, I don't really know "Peer Gynt" well enough to quote from it. (Fats Waller throwing in left-hand licks from "In The Hall Of The Mountain King" for his own amusement is about as far as I get....)
If the Red Bull represents anything at all, it's the utterly unreasoning fear that I've seen take over entire populations over and over: having grown up during the Red Scare of the 1950s, I'm now seeing exactly the same blind panic in the face of the supposed World Jihad. As a Kentucky friend of mine used to say, "Some things'll scare you so bad, you'll hurt yourself." I think that's what the Red Bull's really about.

But I love even being thought of in the same breath with Ibsen. Thank you!
So much for my theory.

Tolkien was famous for not fessing up to who or what exactly Tom Bombadil is: Is he a nature spirit? A Vala? Eru Ilúvatar himself?  When someone who is not J.R.R. Tolkien (as we all, by definition, are not) makes his or her claim, however well-defended a thesis, is it really anything more than a best guess that “seems to fit the facts”? 

In the end, who are we to say definitively what character x in short story y represents, or what poem a by poet b means, or what artist p was feeling or intending to communicate when s/he painted abstract canvas q?  The secret, unless written down by the author, dies with him or her, at which point everything is more or less conjecture, isn’t it?  Put another way: is the message-directionality of expressive art forms from the writer to the reader or from the reader to the piece?  Or both? Put yet another way: from the perspective of the author (poet, playwright, lyricist), is there one “correct” interpretation of a piece, and everything else is “incorrect?” or do such creators surrender their pieces to the minds of the masses? 

If the latter is the case, then maybe that girl back at Cornell was right about me, and maybe Frost just liked to… you know (don’t make me go there).

Friday, April 12, 2013

What Lots of Teachers Think but THIS Teacher Is Not Afraid to Say

I'm afraid this is a very angry post, though not without purpose, and hope. I've removed most of the profanity from my earlier draft.  That's as close as I'll get to "nice."

A video is circulating around the internet right now called "What Lots of Teachers Think But Are Afraid To Say."  It is a sweetly and very reasonably narrated short piece that gently pleads for parents to recognize the realities of modern public education, and to come to the table more informed and aware. Sounds good to me.  "Please, educate yourselves; have a voice on an issue..." the narrator intones.  Much of what she says sounds good to me, in a general sense.  She is, in my opinion, not angry, not urgent, not insistent enough. (One of the advantages of the high ground is that it's great for raining down arrows.) But she's right about one thing: many teachers are afraid to say these things to anyone other than a fellow teacher.

Actually... a few of us aren't scared to say it; teachers just need to be prepared to face the consequences. If you do say these things, administrators will not want you at their school. They don't want teachers to be honest with them; they CERTAINLY do not want teachers to be honest with parents, and as teachers, we are regularly instructed not to be honest with our students.

Teachers have to be prepared to put it all on the line. If you get transferred to a less desirable school out of spite? Bring it on, bitches. If you get canned? Bring it on, bitches. If enough teachers just learn to say, "Bring it on, bitches," or choose to do what Gerald Conti did, then eventually, with patience, time, and sheer numbers, teachers will win the day.

Parents with children in the schools, many of them, anyway, will understand. They may even appreciate it.  There's a lot of anti-teacher rhetoric among government, district/state level administration and certain for-profit educational publishers, but it's subtle, insidious. The really visceral anti-teacher rhetoric is coming from pundits and certain highly vocal (often anonymously, of course) swaths of a grossly uninformed general public, both of which can be safely ignored. And the Pearsons, Common Cores, Say Yeses, etc.... of the world would not dare ratchet up their subtle anti-teacher rhetoric to the level of actually calling teachers out in an obvious public way. A principled assault by teachers against the system that is screwing them over can ONLY result in victory for teachers if their commitment doesn't flag.

But the first time teachers start to waver in fear (of their job, or whatever), the first time they swallow the "you've got to pick your battles" Kool-Aid, the first time they surrender one IOTA of their sacred charge to the cadre of rent-a-morons charged with gradually making teachers and their livelihood obsolete – that's when all hope is lost. Teachers cannot just close their classroom doors, pretend that their four walls are all that matter, bite the pillow, and pray that it ends soon.

OR ELSE IT WILL NOT.

Sunday, April 7, 2013

Educational Technology: Purchase Should Not Pre-date a Plan.

[Updated August, 2017]

This is (sort of) a sequel to my lighter, fluffier post from earlier.  That was the jab, this is the uppercut.  This will read better after having read that one.  Warning: This post is a bit more dense than most.  Educators will be fine, laypersons might need a Jolt Cola or something to get through it...

I noticed technology’s real and tangible impact for the first time as a teacher in the realm of mathematics.  When I was a student in high school and college, all math textbooks had an appendix consisting of various tables and charts.  When I needed the sine or cosine of a certain angle, or had to calculate a logarithm, I would consult a chart in the back of the book, and the figures would be there for me.  If the precise figure that I needed was not there, I would perform a mathematical interpolation to calculate the number I needed.  One day, sometime in the early 90s, when I was a new-ish teacher, I had the opportunity to work with students in mathematics, and I noticed that these charts were not in their (newer) textbooks.  I searched everywhere in the book, and then I realized the hard truth: Of course they weren’t there. Every student had a scientific calculator to more quickly and precisely do the job for them.  I also noticed, however, that none of the students really understood the relationships between sine and cosine, what interpolation was, or the greater mathematical context of logarithms.  They only knew how to “get an answer.”  I became very fearful for this generation of young students.
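For readers who never had to do it, here is a minimal sketch – in Python, purely illustrative, not anything taken from those old textbooks – of what “look it up in the table and interpolate” actually meant:

import math

# A tiny excerpt of the kind of sine table that lived in the back of the book:
# angle in degrees -> sine, tabulated at whole degrees.
SINE_TABLE = {30: 0.5000, 31: 0.5150, 32: 0.5299}

def interpolated_sine(angle_deg):
    """Linearly interpolate between the two nearest table entries,
    the way students once did by hand."""
    lo = int(math.floor(angle_deg))
    frac = angle_deg - lo
    return SINE_TABLE[lo] + frac * (SINE_TABLE[lo + 1] - SINE_TABLE[lo])

angle = 30.4
print(round(interpolated_sine(angle), 4))        # the old way: ~0.5060
print(round(math.sin(math.radians(angle)), 4))   # the calculator's way: ~0.5060

The two answers agree to four decimal places; the difference is that the first one forces you to notice what a sine value is and how it changes between table entries, while the second only asks you to press a button.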

It has only gotten worse.

As a former foreign language teacher, I have likewise found auto-translators, computerized dictionaries, spelling-checkers and grammar checkers to be double-edged swords. A fun exercise:  Take your favorite short story, essay or article.  Copy the first 2-3 paragraphs into Google Translate.  Translate to any common high school second language (Spanish, French, whatever…).  Then cut the foreign-language translation, paste it back into the translation engine, and re-translate it to English.  Behold the mutated crap that the process yields.  This is what it looks like to a Spanish teacher when a kid turns in something in Spanish that has been auto-translated instead of organically written.
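If you would rather automate the experiment than paste text into the web page, the round trip is only a few lines.  This is a minimal sketch assuming the unofficial googletrans package (pip install googletrans); I am not vouching for that particular library – its interface has shifted between versions, and any machine-translation API would do – but the degradation it exposes is the same:

# Round-trip translation: English -> Spanish -> English, then compare.
# Assumes the unofficial googletrans package; substitute any translation API.
from googletrans import Translator

translator = Translator()

original = ("It was the best of times, it was the worst of times, "
            "it was the age of wisdom, it was the age of foolishness.")

spanish = translator.translate(original, src="en", dest="es").text
round_trip = translator.translate(spanish, src="es", dest="en").text

print("Original:   ", original)
print("Spanish:    ", spanish)
print("Round trip: ", round_trip)   # compare this to the original

Run it on a few paragraphs of real prose rather than a famous opening line and the mutations multiply.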

As an English teacher, I have found the Internet to be an incredible source of content, but also a tempting opportunity for plagiarism in what seems to be an increasing tendency towards immediate gratification and lazy shortcut-taking.  Just this semester [Spring 2013], I have logged six egregious incidents, one of which resulted in an expulsion (the student was apparently a multiple offender).  I explored the dark side of human nature with regard to plagiarism in an earlier post.

To successfully incorporate technology, a teacher must therefore know what to incorporate, when to incorporate it, how to incorporate it, and, most importantly, must know what technology can and cannot do. Teachers need to know their students’ needs, and be able to merge the technology seamlessly into a carefully wrought educational plan; computers are not like sprinkles on a cupcake – just putting them there does not make things sweeter.  (Or maybe it’s better to say that they are like sprinkles on a cupcake; they make things look better and fancier, but actually do nothing to improve the quality of the cake itself.)  Lastly, it is crucial for teachers to know how to educate themselves about technology – where to go for resources, questions, information, and help. 

Technology is not a panacea, nor can technology swoop in and save the world for teachers, programs, schools and districts in peril.  As a teacher, I often hear lack of funding blamed for lack of success in the classroom; similarly, lack of resources is often invoked as a cause of woe.  The problem with the way these complaints are framed is that the clear implication is that more money and more resources would miraculously clear up the problem(s).  And given that educational technology can be very cost-intensive, an axiom is set up that does not necessarily compute:

        (Fig. 1)  More money --> More technology --> Better education

A number of studies show that this statement, though perhaps intuitive, is utterly unsupported, the outcome of which is all too commonly visible in schools everywhere: “…an overemphasis on hardware with scant attention paid to the pedagogical and curricular frameworks that shape how the computers are used is common in educational technology projects throughout the world” (Mark Warschauer, “Demystifying the Digital Divide”).

Warschauer tells of two situations where merely throwing resources at a perceived problem did little to resolve it. An effort by the government of India to provide computer and Internet access publicly to children, in what was dubbed a “minimally invasive education” (Warschauer, “Reconceptualizing the Digital Divide”) project, failed when the social structures were not put into place to monitor and instruct and collaborate in the effort.  A more poignant example perhaps can be found in the town of Ennis, Ireland, population 15,000, which was awarded some $20+ million in 1997 as part of a digital grant program to help bring technology-starved Ireland, rapidly emerging from the third world to the first world, to a level of technological sophistication befitting a country on the world stage.  Warschauer reported:
“The prize money that Ennis received represented over $1,200 US dollars per resident, a huge sum for a struggling Irish town.  At the heart of Ennis’s winning proposal was a plan to give an Internet-ready personal computer to every family in the town.  Other initiatives included an ISDN line to every business, a website for every business that wanted one… Ennis was strongly encouraged… to implement these plans as soon as possible.”
Alas, a 2000 visit to Ennis revealed that many of the programs had been disbanded or abandoned, and many of the computers had ended up on the black market.  The technology had been imposed upon the people, and had not been integrated into the people’s social structure.  Warschauer paints these as cautionary tales for American schools and school districts newly aglow in the warm light of technology, and stresses what he calls “technology for social inclusion.” (The concept of social inclusion is tied in to Freire’s critical literacy, and the notion that educational processes should to some degree invoke, validate, utilize and incorporate the learner’s social practices, values and priorities to make the process salient and meaningful for the language learner, but that’s perhaps a blog post for another day; I applaud the general concept, but not the extent to which many Freireans tend to embrace the notion that education is somehow synonymous with hegemony.) 

The goal, then, is not merely to heap technology on a people, as if technology were a grand paradigm-leveler that would “even out” or somehow render more manageable all societal, cultural and traditional differences in the world.  Likewise it is folly to assume that technology may be equally applied to all areas and all people without special consideration of how to integrate it. Warschauer pleads for what he calls “culturally-appropriate interaction” (Warschauer, “Language, Identity and the Internet”). Citing difficulties encountered when computerizing schools in deeply traditional Hawai’i, Warschauer makes an observation that must be applied to the educational realm at large: “A number of patterns of Hawaiian interaction have been identified, and these patterns are all too often at odds with how classroom instruction is organized.”
 
Warschauer’s comments go to the heart of the difficulties that many have with technology’s incorporation into the classroom and into society in general:  It is so prevalent, so powerful, and very quickly growing so important, that either you’re with it, whether or not it fits into your traditions or experiences, or you must abstain from it altogether.  In this he speaks of what some have called “intellectual colonialism” (Anatoly Voronov).  This is the “progressive” (read: “guilt-fueled self-flagellation”) notion that the Internet itself – with the vast majority of its sites in English, and the text-based, literacy-dependent format of its presentation – amounts to a subversive re-colonization of the world by White Euro-American cyber-literati. Even Warschauer struggles with this dichotomy:  “To use the Internet fully usually requires access to resources … which are only available to a minority of the world’s people.  In that sense, the Internet can heighten unequal access to information and power.  But in other senses, the Internet is the most liberating medium ever invented” (Warschauer, “Does the Internet Bring Freedom?”). And true, it is well worth considering that your students may cut across all cross-sections of language, culture, poverty, upbringing, custom and expectations. Still, the indictment is a telling one, suggesting on the part of the accuser the errant and overzealous belief that technology is education, as opposed to just one component of a full educational experience.  My advice?  Recognize the “liberating” aspects of technology, as Warschauer calls them, and embrace those aspects even, but never forget that it is the educational process in toto that can and will liberate youth.

Calling technology a highly liberating medium might seem to reinforce the idea that “poorer” areas are more in need of technological enhancement, and that all a poor school needs is a nice fat educational technology grant and all will be solved. This, however, makes the faulty assumption that the so-called Digital Divide is an economic divide.  In Warschauer’s writings, he claims that the so-called Divide is not one of access to technology or techno-dollars.  It is, rather, a divide in literacy, both in the general and literal sense, and also in the sense that in lower-performing schools – which also tend to be poorer schools, hence the common confusion – technology tends to be seen as a fix, and is thrown at a problem rather than carefully and fully integrated with the pedagogical infrastructure and nurtured with proper teacher training and support.  (Though, I should point out editorially, it is not unreasonable to surmise that this type of training/consultancy is cost-prohibitive in poorer schools, hence its conspicuous absence.) This idea that the problem is not the lack of technology, but the lack of successful implementation of existing technologies, is pervasive.  The lesson for any new teacher is clearly to learn and understand the various applications of available software, hardware, media, platforms, devices, sites, and services, and to become fluent in their use.  There is and always will be something newer, more modern or flashier, so the alternative is to be forever unsatisfied with what you have, an attitude that can only taint a teacher’s daily work with its pessimism.  Remember that money, like technology, is a way to get to a destination, not the destination itself.
 
Case in point, Warschauer has suggested that an approach involving “a combination of well-planned and low-cost infusions of technology with content development and educational campaigns targeted to social development is surely a healthy alternative to projects that rely on planting computers and waiting for something to grow” (“Demystifying”).   The United States has its share of poorer areas; one does not need to travel to India or rural Ireland to find abject poverty.  One also does not need to leave the United States to see tragic wastes of technology dollars on classroom situations not ready for the jump to hyperspace:
“The reports were nearly finished, and the teacher was feeling pleased with the results. When I asked to see one, she steered me to a young man whose report she felt was in particularly good shape. Sure enough, as the student clicked through the presentation, I was immediately struck by the clean graphics, the strong colors, and the digestible writing. Then, suddenly, he was done. This was the extent of his report. But its content was no deeper or more complex than what one commonly sees in civics papers done elsewhere, with pencil and paper, by seventh and eighth graders. Mystified, I asked the student how he'd used his time. He estimated having spent approximately 17 hours on the project, only seven of which had been devoted to research and writing. The rest went to refining the presentation's graphics” (Oppenheimer, “Point. Click. Duh”).
In this case, an 11th-grade classroom in a relatively well-off area of Massachusetts, the millions of dollars that had gone into funding for computers had not gone into adequate training for instructors, who in turn were having students use the computers for little more than high-tech mimicry of the functions of the pens and pencils they used to use.  The computers had no higher purpose; there was no gestalt quality to the instruction or the learning, post-technology.  Clearly my earlier axiom (Fig. 1) is revealed to be bogus. The Massachusetts scenario is a delicious, though tragic, example of Warschauer’s characterization of the Digital Divide, which I described earlier.  There was adequate funding in the Massachusetts example, but that clearly did not solve the pedagogical problem of incorporating technology to enhance the learning experience.

Again, the key is careful, skillful planning by well-trained teachers, and the thoughtful incorporation of these technological elements into instructional practices.  Skill, care, and planning are not economically-driven characteristics, and a motivated and industrious teacher can make technology work for him/her. Warschauer puts it simply: “The key issue is not unequal access to computers but rather the unequal ways in which they are used” (“Demystifying”).  He follows this up dramatically with a comparison of two studies conducted in California’s Anaheim Union High School District, which clearly demonstrated that, with a nod to Bernie Poole’s Eight Pillars (see below), the program in which online content was supported with and supplemented by “face-to-face teacher and peer interaction” was much more successful than the programs that relied entirely upon students’ intrinsic motivation and auto-tutorial sense of responsibility in the face of little or no feedback or support.

There are numerous scenarios in which a fully computerized classroom can be a benefit.  It is important, however, in situations like this to recall that it is not $60,000 in laptops (or iPads, or tablets) that makes a program work for the students; it is the careful and thoughtful preparation by a team of dedicated teachers and technology specialists who do more than merely offer the computers up as divine sacrifices.  Proper integration of technology into the classroom, ironically enough, requires a human component, offered by Bernie Poole in the form of eight “pillars,” or commandments:
1. Active support must come from the top.
2. A non-dictatorial approach is best.
3. Every school should have a core of teacher-computerists.
4. User-friendly technical support must be available, ideally onsite and on demand.
5. Teachers must come first.
6. Parents and students must be involved in the evolutionary process.
7. An ongoing technology training program must be in place.
8. Teachers must be given the time and freedom to restructure the curriculum around the technology.
Poole’s Eight Pillars are part of an entire online book that is available for free download.  I would strongly recommend that any teacher who plans to have a strong technological component in their instruction read it first.  [What is amazing is that even though the piece has not been updated in 11 years (since 2006), the eight pillars are just as valid now, in 2017, as they were then.] Poole does a respectable job of satisfying both skeptics and devotees, and finds a safe middle ground where technology can be discussed on its merits, rather than focusing on the dizzying potential costs of implementing a “dream” technology set-up which, without proper training and management, would be a squandered investment anyway. 

And so, remembering that we reject utterly the quick-fix notion of more money equals more technology equals more success, we come to the issue of how schools that do not have the means to “fully” computerize can “keep up” in the 21st century.  But even this has been answered in the literature countless times.  The so-called “one-computer classroom” or “single computer classroom” is a reality in many parts of the United States.  The Internet is replete with resources that can serve as a springboard for discussion within a department on how best to maximize the use of available technologies, as opposed to how to maximize the budget for purchasable technologies.  Put another way, a purchase should not pre-date a plan.

America’s diversity is a strength, and technology in the classroom can help us tap into that strength.  But merely throwing technology at the classroom, or merely throwing money at schools and ordering them to “acquire” technology… well, that’s no better than throwing language textbooks at a child and asking him or her to “acquire” English.  Whether the school has one computer per classroom or one per student, it is in teacher training, teacher education, strong supportive measures and good quality instruction that technology will find its most useful home.  New teachers should greet this technology with a healthy curiosity and a legitimate desire to test the efficacy of available programs. If nothing else, this exploration on the part of a teacher will confer a comfortable familiarity with educational technologies that can only serve to broaden the teacher’s palette of experiences to draw upon in the classroom, and make it much less likely that s/he will be sold on the first flashy thing a tech salesperson suggests, which would only serve to perpetuate the technological travesty of more = better. 

Educational research traditions are constantly in motion, always changing and being upgraded.  Much like our technology.  One thing all educators have in common is the desire to engage their students in the learning process.  Technology, though it may be a valuable component in that process, is not itself, by definition, the process.   And so, above all, teachers should remember that, as Warschauer (“Demystifying”) says, technology must become “a means, and often a powerful one, rather than an end in itself.”

Note: The above is an only slightly modified version of a treatment I wrote in 2005.  I took it out, dusted it off, and, much like my recent revisiting of The West Wing, it still seems downright prescient, in that it still feels incredibly timely.  Sure, much has changed since then, and the leaps in technology in the seven years from 2010 to 2017 far outstrip the gains of the seven-year period from 1998 to 2005, from which my initial research sources largely came. That this commentary is still vital and applicable is itself noteworthy.

Disclaimer: I myself am a slow adapter and a slow adopter, so maybe I am not (or maybe I am) a choice representative of all of teacherdom.  I love technology’s promise, even as its incursions make me uneasy. I’m not anti-tech, but I often find myself reflexively anti- the people who push tech. 

Do I just need to evolve into the twenty-teens?  Or is education losing its humanity in the face of a technological onslaught?

Saturday, April 6, 2013

The Prisoner - What Classic TV Can Teach Us About Tech

[Updated August, 2017]

I have a love-hate relationship with technology in education. 

I love to use technology in my classroom.  In fact, in some ways, I can’t do without it.  Not long ago, I arrived at an 8:00 class all ready to do a scintillating lesson that required use of the computer and LCD overhead projector in a so-called “smart classroom.” The tech didn’t work; there was a problem with the toggle that switched the feed from the Elmo to the computer, and I could not get the computer monitor’s contents displayed on the big screen. For a few minutes I tinkered with it to no avail.  Then I shut down and restarted everything. No dice. I even tried to go all “The Fonz” (you younger teachers fresh out of school will have no idea what I mean by that, and believe me, I weep for you…) and, aside from a few chuckles from my students, reaped no positive results. I glanced up at the clock and realized I had wasted a good 6-8 minutes, out of a 50-minute class, on this pursuit. I paused, uncertain for a few moments. More time wasted. For a short while, I felt quite stupid that my brilliant and carefully choreographed lesson was dashed against the rocks of fickle fate.

See? Technology makes me so crazy, even my metaphors are stupid.

Long story slightly less long, I called an audible, lateral-tossed the football to an alternate me (stupid metaphor #2), and got on with the class, and my extemporaneous lesson was fine. Why?  Because as much as I was hoping to be able to use the technology, I was still well-prepared and conversant in the subject matter, I knew where I was in the course sequence, I had good personal relationships and rapport with my students, and I had built the course to be responsive to their needs, as opposed to dragging them kicking and screaming through the course at some pre-determined and inflexible pace. In short, when the tech failed, the human element was there to save the day.

I’m not so sure we’re headed in a good direction with tech. It’s bad enough that standardized tests have become practically the gold standard for educational assessment. It’s worse that rigid adherence to bullet-point lists of standards is all that is required to “prove” to an observer that the education is sound and of good quality. In fact, in many schools, as long as your lesson plans have the appropriate sections and list the links to the State Standards, as long as your instruction follows a particular sequence of steps and a designated format, and as long as you incorporate all the relevant trendy buzzwords from the district’s educational philosophy du jour – probably the product of the ministrations of some high-priced and charismatic outside consultant – you are a “good teacher,” and it doesn’t really matter whether your students are benefiting, as long as your instruction at least superficially follows the prescribed norms; there simply isn’t time to provide more thorough analyses of teacher performance, so superficial indicators have to do.

But the superficiality does not stop there. An onslaught of trendy new technological platforms, gadgets and processes is threatening to take the human element even further away from the educational experience.  Now, when most people talk about integrating technology into the classroom, they’re talking about much more than simply projecting traditional content or using web-based communications to interact with students.  Now, students can do class "presentations" with no content, but they look good, and that's just as good as an essay, right? (Buzzword of the day: "alternative assessment.")

Now there’s talk of “a tablet for every student,” or “an iPad for every student.” Wot?

Have you ever seen how well students take care of textbooks? Notebooks? Their own papers? A school I worked at recently didn't even have procedures for recouping the cost of lost materials such as books, and used to lose some $20,000-$30,000 per year in unreturned supplies.  Just sayin'.

How bad has our love affair with tech gotten? Now, out of expediency (I say laziness, stupidity, and is there an adjectival form of "bandwagon"?), some institutions are even starting to computer-score essays. No, not multiple-choice tests... essays. For the record, I once submitted an essay to be machine-scored.  It got a perfect score (a 6, the top score on a six-point rubric).  And it was a very well-written essay. (Duh.) Except for one thing – the essay was total nonsense.  Not only was it not even remotely related to the assigned prompt, but it was not internally consistent. If an Alzheimer’s patient wrote a paper while high on cocaine (yet somehow managing to maintain good grammar, syntax, punctuation, etc.), that would have been my essay.  I did it on purpose. I wanted to see what would happen.
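Why does a stunt like that work? Scoring engines differ, and I have no idea what was under the hood of the one I fooled, but here is a minimal, purely hypothetical sketch (in Python, with made-up feature names and thresholds of my own choosing) of a scorer that grades only surface features. Nothing in it checks whether the essay addresses the prompt or hangs together, which is exactly the hole my nonsense essay drove through.

import re

# Hypothetical toy scorer: it rewards length, vocabulary variety, and "mature"
# sentence length. Real scoring engines are more elaborate, but any system built
# only on surface features shares this blind spot: fluent nonsense can score high.
def toy_essay_score(text):
    words = re.findall(r"[A-Za-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    if not words or not sentences:
        return 1
    vocab_richness = len(set(words)) / len(words)      # type/token ratio
    avg_sentence_len = len(words) / len(sentences)
    score = 1
    if len(words) > 250:
        score += 2        # looks "developed"
    if vocab_richness > 0.5:
        score += 2        # looks "varied"
    if 12 <= avg_sentence_len <= 30:
        score += 1        # looks "mature"
    return min(score, 6)  # 6 = top of the rubric

# A polished, grammatical, utterly off-topic essay sails through, because none of
# these features knows or cares what the essay is actually about.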

Never put all your trust into something that cannot trust you back.  Except my ’04 Camry.  Love that thing. [Edit - As of August, 2017, it has 294,000 miles or so on it, and is really showing its age. It will not last to the new year, alas.]

I don't think tech is necessarily a bad thing, but I do get the distinct sense that tech is being forced into classrooms because of its "cool" factor and not because students (or faculty, for that matter) are developmentally ready and primed to receive the changes. I think that can and will have disastrous effects.  Too much tech without the human element and you basically have, well… The Matrix.

At just 17 episodes, the British TV series The Prisoner was short-lived, especially by modern standards.  The fact that it’s not a title that’s on the tip of everyone’s tongue might further lead one to think it irrelevant or – gasp! – a failure.  Make no such mistake.

Number Six, the show's ex-secret-agent protagonist, does have a lot to teach us.  My last post on this illustrious television show answered that age-old question: Can’t we all just get along?  (Correct answer: What are you selling?)  In this post, I direct my gentle reader to the episode entitled “The General.”  In it, Number Six learns that a fellow named “The Professor” has created a revolutionary educational process: by hooking students/subjects up to a sophisticated machine (called “The General”), they may be given the equivalent of a three-credit college course in a matter of minutes.  Soon, The Village (the setting of the show, basically a black site for interrogating and breaking rogue agents) is aflutter with newly erudite scholars of “Europe Since Napoleon,” the title of the first such “course.”  Number Six quickly realizes that everyone who tries to describe what they have learned in the course does so word for word, identically to everyone else he encounters.  Realizing that the machine completely eliminates the normally clearly demarcated line “between knowledge and insight,” Number Six correctly deduces that it is intended to be used as a form of mind control over the denizens of The Village, and as a way to extract information from them. The Villagers’ apparent trust in the process makes them even more susceptible, and Number Six reasons that he must destroy The General (which he does, of course) in order to save both himself and his fellow Villagers.

View the full episode here

My very next post will be a research-based look at my general angst about our rush to over-technologize American public education.  It will be longer and denser, but more “scholarly.”  And it will read rather as a “Part II” to this post.  Read it!

For now, I leave you with this question: Are we going too far, too fast? Should we not pace ourselves a little bit more? Will upping the tech really change the culture, which is perhaps the real source and reason for school failure? And what exactly is meant by “change the culture,” anyway?  And why am I asking so many questions, when I only said I was going to ask one?

Thursday, April 4, 2013

Gerald Conti shrugged...

[Updated August, 2017]

Westhill High School, just outside of Syracuse, NY, is one of the top schools in one of the top school districts in Upstate New York.  And just this week (this post was initially published on April 4, 2013), one of its veteran teachers publicized his retirement letter, touching off a nearly viral outpouring of sympathy, empathy and, dare I hope it, outrage.  No, not outrage that he was leaving, just two years shy of the full retirement benefits that a 30-year tenure would bring.  Outrage at the direction public education is going.

Could this be the straw that finally breaks the camel's back?  I sure hope so.

I've been complaining about the same stuff for 15 years (read my statement of philosophy).  Behind closed doors, colleagues would listen, nod their heads, "Yeah, man, that sucks, someone should do something."  I've never been one to keep my mouth shut about things like this, so I often would do something, or say something, and then, curiously, all of the sympathetic support, the righteous anger, the steely resolve, would vanish in a puff of fear-for-one's-job.  I never knew when to shut up (if you know me, you're nodding your head), so I was occasionally, uh, "denied tenure," I think is the polite way to say it. I could never "play the game," as I was often told to do in hushed tones by my colleagues.

But Mr. Conti, he did it right.  Read his letter here.  Not in a huff or in a fury, like the Weasley twins quitting Hogwarts, but with a quiet, resigned (no pun intended) sigh of disgust tinged with profound sorrow.  His letter is not Twisted Sister's rageful "You're gonna burn in hell..."; rather, it's Metallica's matter-of-fact "I dub thee 'unforgiven.'"  He simply decided it was time for him to go.

Ayn Rand wrote Atlas Shrugged in 1957.  In it she described an alternate America in which people of integrity in education, art, letters, and especially industry, are increasingly forced by the government to act against their own best interests, in the name of serving other masters - the collective, the State, the amorphous entity known as the "common good."  Bit by bit, the great minds of America realize that the vocations, professions, lifestyles that they love and cherish are being ruined by oppressive and meddlesome rights-denying bureaucracy that, more often than not, doesn't really understand the nature of what it's meddling with.  In the end, these great minds decide (are persuaded, actually, you'll have to read to learn more) to "go on strike," and one by one, over a period of time, they vanish from the world, their factories, industries, mines, foundries and establishments abandoned or destroyed by their own hand.  The nation is sent into a panic, both at the trend, and at the fact that so many of the "prime movers" of what Rand called "the motor of the world" have simply vanished - ceased utterly to be productive, by choice - daring the country to try to survive without them. 

They are right. It can't.

When I read Mr. Conti's letter, I felt that sense of righteous rebellion, mixed with a twinge of sadness.  I sensed the spirit of those in Rand's novel who, realizing that their spirit is being taken away from them, decide to withhold all access to that spirit from the bureaucratic vampires, moochers and looters (some of Rand's favorite words), to give them nothing to feed on.  Lacking sustenance, maybe they would starve.

I'm guessing it was not Mr. Conti's deliberate and specific purpose to send that particular message to the Board of Education at Westhill.  Maybe Atlas Shrugged never crossed his mind.  (Maybe he's never even read it.)  Maybe, like The Prisoner, he just wanted to escape.  Either way, his personal integrity shines like a beacon for all to see. But will anyone follow?

"After writing all of this," he writes in his conclusion to the two-page letter, "I realize that I am not leaving my profession, in truth, it has left me. It no longer exists. I feel as though I have played some game halfway through its fourth quarter, a timeout has been called, my teammates’ hands have all been tied, the goal posts moved, all previously scored points and honors expunged and all of the rules altered.  For the last decade or so, I have had two signs hanging above the blackboard at the front of my classroom, they read, 'Words Matter' and 'Ideas Matter'. While I still believe these simple statements to be true, I don’t feel that those currently driving public education have any inkling of what they mean."

I've had so many colleagues tell me privately, "I really should switch careers," "This isn't what it used to be," "I don't know why I put up with this," "This isn't worth it!" and "This is just so damn depressing!"  But they stay.  Is there honor in suffering?  Is there glory in that kind of woeful sacrifice?  (Maybe it's a religious thing? Self-flagellation?)  Or is it better to send this simple message?   "No, not anymore."




Tuesday, April 2, 2013

Something Old, Something New, Something Borrowed...

[Updated August, 2017]

Students cheat so darn much!  Well, some.  I myself have seen a significant uptick these past few years, in students who are willing to go to great lengths to turn in a finished product, by hook or by crook.  The operative word being "crook."

But what happens when it’s the professionals who cheat?  Or, in this case, plagiarize?

Take Prof. Santiago (“Yago”) Colás, professor at Oberlin (formerly University of Michigan).  No, he’s not a cheater. He is a victim. 

I had the pleasure of meeting Yago online and (virtually) talking shop with him a little bit some time ago; we share a deep and abiding love of one particular short story. While geeking out and Googling information on said story, I came across Yago’s web page.  On it, Professor Colás describes a Julio Cortázar short story called “La Autopista del Sur” (English translation, “The Southern Thruway”), one of my favorite pieces. In this story, a traffic jam on a country freeway, a snarl of epic proportions, uncountable kilometers in length, lasts so long that an ad hoc society forms within and among the people trapped in their cars. Time magically slows down. Friendships, kinships, romances form. There is death.  And then, suddenly, traffic starts moving again, and people go about their merry way, rather like nothing ever happened. I cannot believe no one has bought the film rights.  It seems ready-made for cinema.

Check out Yago’s web page, with his original content and analysis. [Link dead] 

Harvard Law School dean Martha L. Minow seems to have borrowed liberally from Colás’s decade-old online musings when constructing her 2010 Law School Commencement address (or when someone constructed it for her, to be fair, because Important People often do not do their own speechwriting). I have been a high school and college English teacher for two decades, and Minow's “liberal borrowing” MORE than crosses the line into out-and-out plagiarism. See the image below, which shows color-coded matching chunks of text. I have given students zeroes on essays for less obvious transgressions, and no professor I know would stand for (what appears to be) such blatant theft. 

Out of a sense of fairness, I have marked, with an underline, the lone acceptable paraphrase. Plus, the following Minow line is really nicely crafted, and, it seems, original: "The tendrils of connection forged in the crisis stretch and strain as the cars speed ahead." (See below for the context.)
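For anyone curious how those color-coded matches get flagged in the first place, the mechanics are not exotic. Below is a minimal sketch (in Python; the variable names, sample placeholders, and the six-word window are my own illustrative choices, not the method of any particular plagiarism checker) that slides a word window over two texts and reports every run of words they share verbatim.

import re

# Illustrative sketch only: flag every six-word run that appears verbatim in both
# texts. Commercial checkers do more (stemming, fuzzy matching, huge reference
# corpora), but verbatim n-gram overlap is the basic idea behind the highlighting.
def shared_ngrams(text_a, text_b, n=6):
    def ngrams(text):
        words = re.findall(r"[a-z']+", text.lower())
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    return ngrams(text_a) & ngrams(text_b)

source_page = "..."    # e.g., the earlier online commentary
later_speech = "..."   # e.g., the later address

for chunk in sorted(shared_ngrams(source_page, later_speech)):
    print(" ".join(chunk))   # each shared six-word run is a candidate match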

For the record, I have corresponded with Yago Colás and shown him the offending document. He recognized it for what it was, I think (I don’t want to put words in his mouth), and was pretty zen about it, telling me I could basically do with it what I wanted. This was around 2012. I held on to it for a while because I figured, what would be the point?  It would probably bode better for my long-term cardiac health if I could be more zen about it as well. But repeated goings-on at other institutions have just made me decide not to hold on to it any longer:

· Harvard: Read about it here or here;
· Atlanta City Schools: Read about it here or here;
· And this scandal among med school students at Syracuse University.

I hope, as Harvard University comes down hard (deservedly so) on its students for such offenses, that it is consistent in its outrage at such behavior and will investigate/respond accordingly. A Law School Dean should know better. As an English professor, I am personally outraged.  Same thing with the students in Atlanta – if they’re guilty as charged, throw the book at them (not that they’ll actually read the book once it hits them, though they may copy from it).  As for the Syracuse scandal, well, you can read about the fallout.

I hope Martha Minow did not write the speech herself; I really hope it was handed to her to read.  Politicians have most of their speeches written for them, so it's a distinct possibility.  Maybe then Minow has plausible deniability (“I didn’t write the speech myself”).  Well… I had a student recently turn in a paper that he “didn’t write himself.” Guess what happened to him?