
Sunday, November 27, 2011

Inadequacy by Design

The most recent picture of our granddaughter, Beatriz, shows me a little girl who does not want to go to school. I don't blame her. I didn't much want to go to school myself. I skipped school a lot when I was growing up, right up to the 15-day-per-term limit. I even skipped so much kindergarten that the teacher, a friend of my mother's, suggested I wasn't ready for school. (In my family, this is referred to as "when Mary flunked kindergarten.") There were a lot of reasons to dislike school -- algebra, jocks, assemblies -- but most of all it was the mind-numbing dullness of the whole institution.

The tedium of schools is ironic. Enter "engage students" into Google, and you'll get about 25 million hits. Everyone wants students to be engaged so they will in turn learn their skills, be able to locate and apply information, and think critically about the world they live in. What's more, we have learned a great deal about teaching and learning. Numerous teaching approaches and strategies actually do engage students and support critical thinking. The problem is that schools weren't designed to encourage thinking to begin with. Here's a little history:

In 1892, the Democratic Party adopted a platform plank to ban factory employment for children under 15. This was the same year that Ellis Island opened in response to staggeringly high numbers of immigrants. The question was, if all these immigrant children are coming and we can't put most of them in factories, what do we do with them? The answer was, "Get them ready to be good factory employees." Have you never wondered about the school system's obsession with punctuality? School bell or factory bell, we want students to be on time. What about following instructions? Waiting to be called on? Sound like good training for a factory worker? It's no accident. By 1892, they were reading the handwriting on the blackboard.

1892 was also the year the Committee of Ten, chaired by Harvard's president Charles William Eliot, was formed to standardize the curriculum of high schools in preparation for the onslaught of immigrant children. Eliot argued that the purpose of education was to prepare students for their “evident and probable destinies in life.” Children of factory workers should be educated as if that were their destiny. Children of more elevated parents, typically enrolled in private academies or studying under a tutor rather than attending public schools, would continue to be educated for a variety of possibilities.

Although the demographics have changed, schools are much the same as they were in the 19th century, despite access to iPads and the Internet. The most recent major legislation affecting schools was No Child Left Behind, and its effects were just as insidious as those of the Committee of Ten. Whatever its intentions, the effect of NCLB was to honor standardized testing above any other measure of success -- which meant that teachers were focusing on regurgitation rather than thinking. Further, NCLB didn't seek excellence. Rather, the goal for each school was adequate yearly progress. Adequate!

In whose interest is this subtle campaign against thought? I would guess the same as it was in 1892. The factory owners of yesterday are the 1% of today. A thinking populace is not in the best interests of those who would maintain their elite status. Those who think are inclined to protest. Those in power attempt to put them in their places. As Carl Gibson notes in a recent article, "The recent Black Friday mobs of consumers pitching tents in parking lots and rioting over $2 waffle irons were met with silence from the police." Yet pitching tents on public property and protesting peacefully merit pepper spray and billy clubs. The difference? One group showed up punctually to follow a marketing campaign's instructions. The other group? It was just saying No.


Thursday, November 24, 2011

Of sage and saging

Today is Thanksgiving, here in the U.S., a day we give thanks for the bounty in our lives. More cynically, it commemorates our early congress with Native Americans: they gave us corn and we gave them smallpox. It is one of the better holidays we celebrate here and in my memory it always starts with waking up to the smell of sage.

Our mother made the best dressing in the world. It was buttery, dotted with onions chopped infinitesimally small so that none of us kids would recognize and reject them, celery, and enough sage folded into the hand-torn bread (no cubes for us) to make us remember it all year. One of the best things was testing for sage. As mom was putting it all together in the big black skillet, she would call us into the kitchen -- there were six of us kids -- and ask, "Does this have enough sage?" We'd each get to try a bite. I always told her "no" so I could have another taste after she added more, and I suspect my siblings did as well. And then we'd watch as she stuffed the big gruesome bird and staggered with it to the oven.

Later when we had our own homes the recipe began to change. Suzy added mushrooms and walnuts. I adopted her changes but added bulk sausage which introduced even more sage. Pete even made a stuffing with feta cheese one time. Nancy, more of a family purist, would say, "Please just make it the way mom did!"

As I've wandered in and out of various spiritual journeys, I've adopted other ways to use sage, especially in cleansing ceremonies. I'm not sure what the reaction between negative energies and burning sage is, but I know it makes me feel safer and closer to the times when humans first threw dry leaves onto their fires and discovered their sweet and bitter aromatics. One friend tells me that the burning of sage helps us wash off the world before we enter sacred space.

Today when you are making dinner, sprinkle a little sage on the burner, for what is Thanksgiving but a sacred space in the year, a time we celebrate what and who we have? I'll be remembering my childhood today and be thankful that memory is so tenacious that I can clearly see the parents and sisters I've lost, and the brothers and sister I still have. We think we only remember in words and pictures, but there is a primal connection between whiffs of smell and episodic memory. Today in my kitchen, I'll be breathing deep into the jar of sage and sprinkling some on the burner to summon the memories I cherish and keep at bay whatever was not love.


Monday, November 21, 2011

Characters Behaving Badly

“The Well of Lost Plots is where vague ideas ferment into sketchy plans. This is the Notion Nursery: The Word Womb. Go down there and you’ll see plot outlines coalescing on the shelves like so many primordial life forms. The spirits of roughly sketched characters flit about the corridors in search of plot and dialogue before they are woven into the story. If they get lucky, the book finds a publisher and rises into the Great Library above.” from Lost in a Good Book, by Jasper Fforde.

Last week, you will recall, I was re-reading The Eyre Affair, by Jasper Fforde. When I finished, I couldn't help myself and picked up its sequel, Lost in a Good Book. In this volume, Thursday Next is moonlighting as a Jurisfiction operative, policing crimes which take place within books. Not crimes like murder or theft, mind you. As apprentice to the intrepid Miss Havisham, on occasional leave from Great Expectations, Thursday faces such duties as preventing minor characters, bored with their roles, from book-hopping without the proper permissions. In the world of Jurisfiction, books and their characters have little to do with their authors, who seem almost incidental.

This is how I feel when I’m writing. The characters are stubborn at best and out of control at their very worst. They don’t seem to care what direction I think I’m going; they just do what they please. Unless you write fiction, you have no idea how disconcerting this can be.

I didn't experience this phenomenon until my second book, High Spirits at Harroweby. My protagonist, Selinda Harroweby (who was to have been modeled on William Congreve's poem about "Pious Selinda"), has just fled to her chamber in a temper when there is a tap at her door. I hadn't planned this, but thought perhaps the entry of a trusted servant would be a good opportunity to reveal back story. Instead, in walks a little girl in a nightcap. I had no idea who she was at first, but I soon learned that she was Selinda's little sister, Lucy -- and she had to be worked into the plot.

This happens to writers all the time:

It begins with a character, usually, and once he stands up on his feet and begins to move, all I can do is trot along behind him with a paper and pencil trying to keep up long enough to put down what he says and does.
- William Faulkner

One of my standard -- and fairly true -- responses to the question as to how story ideas come to me is that story ideas only come to me for short stories. With longer fiction, it is a character (or characters) coming to visit, and I am then obliged to collaborate with him/her/it/them in creating the story.
- Roger Zelazny

On my one and only longish story my characters surprised me several times by explaining to me in words of one syllable how they should behave and what their back story was. It surprised me at the time. I suspect it's universal.
- Geoff (from a post on a website for transgender writers)
It's good to know I'm not alone, because it happened again this week. I am working on another mystery, The Ghost Doll. My protagonist, Liz Venables, is a sculptor. In the opening scene, she is working on a new piece for an installation. She is hanging upside down from an overhead pipe and wielding a blowtorch to reshape some wire mesh. This sort of thing is hard enough to describe without stretching the reader's credulity to the snapping point, but all of a sudden in walks someone -- man or woman I'm not sure -- who greets Liz, "Hey there, monkey girl!" Some character I haven't planned for has just walked into the novel. Friend? Sibling? Soon-to-be ex? No one else spouts that kind of talk.

I certainly have to stop and re-think now. S/he's here for a reason, and I will just have to continue the scene to find out what I need to do to the plot. Like Thursday Next, I am stuck in the Well of Lost Plots. Chapter I is going to take a lot longer than I thought!

Saturday, November 19, 2011

Grammar makes liars of us all

When people find out that I write fiction, they often ask where I get my ideas, how I come up with characters, where I get their names, how I found an agent, etc. I always answer, but the odds are I'm lying.

Deirdre of the Sorrows
Here's what I mean. The main character in The Fool's Journey is named Deirdre Kildeer. I loved the name Deirdre when I was a romantic adolescent and, because the character Deirdre changed her identity when she was young, I reasoned that a girl with poetic tendencies like my own might choose a name associated with myth and tragedy. (I also liked the name "Antigone," but it doesn't look as good on the page.) Deirdre's surname began as Kildare. She is also a university professor, and I realized that students might address her as Dr. Kildare -- which raised specters of Richard Chamberlain in his 1960s series of the same name. Similar in sound, "Kildeer" popped into my head, perhaps because my friend Meg is from Kildeer, North Dakota. This worked well because Kill+deer (dear) added psychological symbolism to Deirdre's choice of name.

This explanation is sheer invention. Very little of my character-naming process was a matter of decision. As I wrote, synapses fired and connected, and I went with them. That is what we call inspiration, or creativity, or the Muse. It became fiction when I superimposed the sequence and grammar necessary for explaining how I came up with the name.

Inventive thought is interior, full of words and images. We make sense of the mental melange and are able to communicate it only when we add structure. The explanation above contains elements that are true, but it is only one version of what happened in my brain when I was writing. To me, the interesting thing is that ideas will never come to anything unless you translate them -- in jotted notes, word maps, outlines, conversation -- by adding some kind of grammar. If we didn't need to tell anyone anything, we wouldn't need grammar. Interior thought would be sufficient. James Britton, who called this inventive phenomenon "shaping at the point of utterance," encouraged teachers to allow and support the inventiveness that arises out of articulating semi-formed ideas in numerous drafts and in conversations about them -- as opposed to the lock-step "first draft-only draft-done" manner so many of us suffered in our schooling. I might also add that if we must teach formal grammar, a focus on how its structural components help us communicate would not go amiss.

Wednesday, November 16, 2011

At last -- freedom from hierarchy

In the 1980s, scholar James Burke created a show for PBS called Connections. In the course of each episode, he would demonstrate the interconnectedness of cultural and scientific evolution, noting that "there is always a connection but, if the link has never been made before, nobody knows it's there." Through a circuitous examination of ideas, Burke would reveal such connections as the one between the use of cobalt in dyeing Ming vases and its importance in the manufacture of computer chips.

Popularizing this kind of thinking has led to numerous insights about the importance of personal idiosyncrasy and historic serendipity in the analysis of technological progress, but it is also illuminating to see how the development of ideas in education can be a matter of chance -- or even destiny. Let's try it, and I promise that I will arrive at something thought-y.

Two years before Abraham Lincoln's 1863 Emancipation Proclamation, Tsar Alexander II of Russia issued a Declaration of Emancipation for his feudalistic empire's serfs, bonded laborers whose crown-enforced duty was to work for landed nobility in exchange for protection. The entry of an entirely illiterate population into society was fraught with problems, many of which were analyzed in essays written by novelist Leo Tolstoy. In these essays, Tolstoy laments the inability of former serfs to understand why written stories hold more value in society than those circulated in oral tradition, and speculates on ways in which education might help move the children of serfs forward into a more enlightened mindset.

Fast forward sixty odd years to young Russian cognitive psychologist Lev Vygotsky, who reads Tolstoy's essays and uses them to round out his own ideas on literacy acquisition. About the same time, American educator John Dewey visits Russia, meets Vygotsky and they discuss the ideas of the Pragmatic Movement. As a result of this conversation, some have suggested, Vygotsky began to articulate his ideas about the Zone of Proximal Development and scaffolding in terms that were less speculative and more concrete.

So, as Robert Frost reminds us, "way leads on to way." It's interesting to me that the way events unfold is similar to how the brain works: interconnected pathways of information snap and fire, allowing new insights, raising new fears and enabling new ways of thinking. I love the lack of hierarchy both in Burke's view of history and in cognitive function. We think we need hierarchy, but it's really just there to help us categorize and explain. Nothing really happens that way. Life is a whole lot more like a collage than an outline. How liberating!

Tuesday, November 15, 2011

The Silly Putty of Memory: Neuroplasticity

About three years ago, my husband Jose took a spill down the basement stairs which resulted in a traumatic brain injury. It was terrifying and fascinating. After several generally non-responsive days in the hospital, the doctors advised that I put him in an adult care home. I knew there was no future there and took him home to heal. He started to improve within days, but we ended up going through six weeks of amnesia together.

Jose remembered his languages and could respond to questions in English, Portuguese, Spanish and French, but everything else was slow: how to move around, talk, think. A native Brazilian, Jose had been in the States for about twelve years at that time, but he didn't remember it. As he began to talk more, I learned that he thought he was in Sao Paulo. He was angry that I wouldn't drive him to his home in Rio. He knew the dogs (of course) and he knew that he should know me, but it took a long time before he began to put all the pieces together. Slowly, his life took shape again. He's fine now, except for a diminished sense of smell.

I've been reading about the brain ever since. I highly recommend The Brain That Changes Itself: Stories of Personal Triumph From the Frontiers of Brain Science, by Norman Doidge. The author recounts remarkable stories of people who have overcome enormous physical and mental disabilities through the emerging science of neuroplasticity. There's the woman who, after an injury to her inner ear's vestibular system, feels herself in a perpetual free-fall, unable to walk or perform any task which requires a sense of balance; a stroke victim deprived of the use of his right arm; even an amputee suffering from phantom limb pain. Through the efforts of pioneers in emerging brain-mind science and the experiential exercises they devised, all were able to right themselves.

Doidge goes even further, though, as he demonstrates that our thoughts can change the structure and function of the brain -- and its limitations. In education as in other fields, we've long been taught that the brain is essentially unchangeable. We've generally been encouraged to help students -- especially the learning disabled -- find strategies that leverage their existing abilities rather than changing the brain itself. Part of the problem, he argues, arises out of the common metaphor which compares the brain to a machine, and prompts us to use such terms as "hard-wired" and liken synapses to permanently connected circuits.

In the book's preface, he states:
[Neuroplasticians] showed that children are not always stuck with the mental abilities they are born with... In the course of my travels I met a scientist who enabled people who had been blind since birth to begin to see, another who enabled the deaf to hear; I spoke with people who had had strokes decades before and had been declared incurable, who were helped to recover with neuroplastic treatments; I met people whose learning disorders were cured and whose IQs were raised; I saw evidence that it is possible for eighty-year-olds to sharpen their memories to function the way they did when they were fifty-five.

I know what you're thinking, but Norman Doidge is not just some neuro-scientific version of Shirley MacLaine. He is on the Research Faculty at Columbia University's Center for Psychoanalytic Training and Research, in New York, and the University of Toronto's Department of Psychiatry. The scientists whose work he explores are leaders in their fields, often Nobel laureates.

I find Doidge's book enormously liberating for myself (perhaps I am not stuck with my mental shortcomings) and for the field of education. Finding that the brain changes physiologically as a result of experience makes me even more committed to such approaches as project-based learning, but it also prompts concern that we are not exploring the possibilities and implications of experiential learning to the extent we should. Are we limiting efforts by focusing primarily on content standards? Are we ignoring the standards of the mind?

Monday, November 14, 2011

A little knowledge is a dangerous thing

I am in word mode today. This happens occasionally when I come across a word I love and it leaps from the page begging for me to use it in a sentence. Today, it's gnosticism. Gothic to look at, difficult to spell and define. We all know (or at least use freely as if we did) agnosticism -- some sort of socially acceptable form of atheism -- but what, pray tell, is a gnost? It doesn't sound good or look good, if you come from the gut-level response school of etymology. However, sounds and appearances are sometimes deceiving. (Enervate, for example, sounds like it should mean "stimulate." Doesn't, though. Just look it up.)

The root gnos derives from the Greek "to know." This is actually where the English word know comes from, but for fun, we changed the g to a k, though we retain the g in prognosticate, ignorance and diagnose. OK. But gnostic? Gnosticism? The ic and ism of knowing? That seems a little all-encompassing -- and it should: Gnosticism seems to be not just knowing, but beyond knowing: the essential, the essence, the esoteric nature of spirit and spirituality. Gnosticism, in many of its incarnations, rejects the material world and in some cases, most notably A Course in Miracles, suggests that the material world and all of our supposed experiences are the soul's nightmare. Pain, greed and evil didn't arise out of sin -- we made them ourselves out of fear. The Course, its followers claim, is the word of a disappointed Jesus revealed to a pair of transcribing physicians at Columbia University.

Gnosticism isn't new. We can find traces of it all the way back to Zoroastrianism. Interestingly, for a term that seems -- to me at least -- quite innocent, gnosticism has a lot of detractors. It even has its own heresies led by "false teachers." Christians as far back as St. Paul are spooked by it ("knowledge falsely so-called" in 1 Timothy 6:20). You have to ask, what is so scary about gnosticism, this intuitive understanding of spirituality? Too much independent thinking outside the Book? Not enough rules? No support for Bingo nights or collection plates? It's certainly tempting to think so.

So, gnosticism. My word of the day. Maybe next time we'll do silhouette. I love how it's spelled, the little whisper of h, how it looks on the page, how ...

Saturday, November 12, 2011

The Dangerous World of English Literature


In preparation for starting another book, I've decided to indulge myself re-reading some of my favorites. I started last night with The Eyre Affair by Jasper Fforde: surrealism at its most entertaining. The story takes place in an alternate England of the mid-1980s where literature is the obsession that drives a nation. Protagonist Thursday Next, a literatec (literary detective), is hot on the trail of arch-villain Acheron Hades who has wickedly stolen Jane Eyre right out of her novel! This is a world where Shakespearean and Marlovian adherents duke it out on street corners, manuscripts are held hostage and family pets are more than likely genetically re-engineered dodos named Pickwick. In the world of The Eyre Affair, people live -- and are shot dead -- for literature.

I adore almost everything about Jane Eyre (except for some of the movies based on it, especially the most recent!) and the Brontës (except for that poor excuse, Branwell). My first dose came with the Classics Illustrated comic version (these may have been early versions of Cliff's Notes) belonging to one of my older sisters. Why should it have been so thrilling? Is it a testament to Charlotte's genius that I could be mesmerized by a mere outline of the story and some really, really bad art? What other explanation is there for my living and breathing Jane Eyre and being terrified at the notion of any attic until I was well through my adolescence -- this despite the fact that I never actually read the book until the summer I was fifteen? Or is this phenomenon merely predictive of one's becoming an English lit major and talking funny the rest of her life? (Some people claim they can actually hear my semicolons!)

I don't know for sure, but my fixation with Jane and her various dilemmas and odd quirks cannot possibly have been good for me. My teens were not normal: I lived in a different century. My big crush was a fictional hero. I studied the speech and manners of all the characters until I could have passed for one of them. This obsession (along with its counterpart, Pride and Prejudice) served me well in later years as a writer of historical romance, but I know my classmates at Kellogg High thought I was downright weird. They were right. Later, when I finally had my first literature classroom and introduced my 11th graders to Jane Eyre, they proved undeserving of the honor: they mocked the dialogue, questioned the period's moral scruples and called the work Jane Air-Head. I decided then -- and stuck to it for almost a year -- to never teach anything I cared about.

In the years to come and thanks to numerous mentors (thank you, Don Graves and Jane Hansen), it became clear that every student needed the freedom to follow their own obsessions in their reading and writing choices and learn what they could from them. I found out I couldn't make them care about what I loved. I could only create environments where it was all right to care about just about anything. That insight saved my teaching and possibly my students.