Saturday, August 10, 2013

Chatham Reflections

When I was twelve years old, I read Origins Reconsidered by Richard Leakey.  Richard Leakey is a paleoanthropologist, most notable for his discovery of a nearly complete juvenile Homo ergaster skeleton near Lake Turkana, Kenya.  The book was about his fossil hunting in East Africa, human evolution, and, to a lesser extent, religion.  Reading his description of the scientific mindset, the mystery of life, the richness of hominid evolution, and the myopic worldview of the majority was the most consequential moment of my formative years.  I've been an enthusiastic student of science ever since.  In my mind, nothing even comes close to offering such a substantial, interesting, complex, and nuanced worldview.

I view my goal in education as generating this interest and enthusiasm for science in my students.  There are several obstacles to overcome if I am to be successful.  One is, how do I offer a compelling answer to the ubiquitous student (and adult) question: when will I use this?  I could offer a myriad of examples as to how science has increased humanity's well-being: vaccines, engines, computers, etc.  But not everything we teach and learn has this explicit practical value, and for every answer we give to that question we are implicitly stating that certain other things are not important.  "Ok," the student may reply, "I can see why computer science is important because computers have become omnipresent in our lives, but what about learning the micro-architecture of a moon snail's radula?"

Giving this type of answer buys into the premise that knowledge is only important if it has utility.  A part of me wants to answer the question the way a former editor of New Scientist magazine did: "I think science is interesting, if you don't agree then fuck off."  Of course, that wouldn't be very prudent or productive, but it is true that if we keep trying to justify what we teach based on what we can do with that knowledge, we will be doing a disservice to our students.  The best answer I can think of is: Would you rather look at the world through blurry eyes or clear eyes?  Learning gives you a better eyeglass prescription, allowing you to increase the resolution of the world you experience.

This attitudinal obstacle may be the biggest obstacle of all.  I don't believe our educational system is fundamentally flawed.  Sure, increase student-centered activities if it increases achievement 3% compared to a control group in a few methodologically flawed research studies.  The debate of traditional vs. progressive education is interesting, but I don't believe it's going to fundamentally change the downward trajectory of education.  It's like debating a change of course on a sinking ship.  Compared to generations past and other countries, our students have access to one of the best educational systems that has ever existed.

There is a fundamental lack of interest in learning academic subjects.  Our society values entertainment more than education, and this is why our students are not succeeding at the rates we wish.  How many adults read science books, watch documentaries, visit art museums, and go to symposiums more than they watch football and reality TV contests?  It's been said that every generation laments the next generation's preference for entertainment over education and its weaker work ethic, but has any generation before now had access to the variety and quality of entertainment we have 24 hours a day?  It's absolutely corrosive to the academic mind, the mind we as educators are in charge of fostering.

Zooming out further, there is no doubt that historical and economic inequalities are at the heart of many of the academic achievement gaps found in this country.  Paradoxically, I don't think educational efforts are going to fix this educational problem.  If students live impoverished lives where the only vibrant economy in their area is the drug trade, then it's going to be hard to convince them to buckle down and focus on their studies.  These are vestiges of racism and runaway capitalism, and how to fix these problems is not so clear.


Sunday, July 28, 2013

Requiem For Deep Thought

"Technology giveth, and technology hath taken away" - The Bible, or something. 

I was a fan of Nicholas Carr's 2008 Atlantic article, "Is Google Making Us Stupid?"  It articulated all the worries many of us have been feeling as our identities and thought processes have slowly, but inexorably, become enmeshed and continuous with the internet and the devices used to access it.  The Shallows expands upon the article and makes a compelling, yet measured, case that we are not just gallivanting about the internet due to the accessibility of information and streamlining of communication, but that its sticky silk has trapped us in the web and is changing the very way we think.

The core of Carr's argument centers on the internet's effect on our attention span.  The longer we can focus on one subject, the greater the chance we can copy it into our brains and integrate it into our intellectual frameworks to create what we call an "understanding."  Without this sustained concentration, we are simply learning bits and pieces of information, without making the connections required for deep learning.  He supports this thesis with a variety of scientific studies and anecdotes.

For example, the phenomenon of hyperlinking is ostensibly one of the greatest boons of internet reading compared to traditional book reading.  We have always intrinsically known of the vast connections between the various content we take in, but hyperlinks allow us to walk along these connections whenever the impulse emerges.  This may be a good thing, allowing us to create a more robust intellectual framework with a wider array of connections; in essence, a better understanding.  So is this the case?  Carr discusses Diana DeStefano's review of the hyperlink literature, and it clearly shows a negative correlation between the number of hyperlinks and the comprehension of the reader, regardless of whether they actually clicked on them.  Why is this so?  The idea is that each hyperlink forces us to take a break in our reading and make a decision: to click or not to click.  This does two things.  One, it distracts from the thoughts and ideas being communicated by the text.  And two, it adds more information to our working memory, creating cognitive overload.  Both of these effects decrease our ability to transfer information into our long-term memory.

This vice of the internet is the same as its virtue: it offers too many choices.  This creates what psychologist Barry Schwartz calls "the paradox of choice."  We are given so many choices that it creates an anxiety that we have not chosen the "correct" link, webpage, music, movie, etc.  We cannot just sit there and enjoy something for a sustained period of time because the background anxiety builds that we are not using our time wisely and must continue the search.  Carr presents the case that this is a wide-ranging phenomenon among those of us who have drunk deeply from the internet and tasted its crisp Pierian spring.  It is especially true of me, which is why I find the book so compelling.  This loss of attention span has happened to me once before, so I am a bit more wary and cautious of the internet, having experienced life without much of an attention span.

When I was younger, I was an avid chess player.  I read books on openings, endgames, strategies, tactics, and the great chess games of all time.  I would sit there and play a game of chess for two to three hours, focused deeply on the game.  I can still remember many of these games.  I actually got pretty good after a while, playing in state tournaments and once beating the number one ranked chess player in Pennsylvania for my age group.  Then (dun, dun, dun...) I discovered speed chess.  I fell in quickly and deeply.  In the speed chess I preferred, the players have only one minute total to move their pieces.  I eventually managed to increase my speed to about one move per second.  While I greatly enjoyed the excitement and quick thinking, I began to feel its effects on my longer chess games.  I could no longer sit there and think moves ahead.  Simply put, I became very impatient and distracted.  This transferred over to my normal life as well.  Waiting in line, sitting in traffic, opening movie credits, books, conversations: everything seemed to take too long.  I eventually came to the conclusion that this deterioration of my attention span was no good, so I gave up chess and have barely played in the last ten years.

Well, I feel it happening again with the internet, consistent with Carr's experience and the research he cites.  In order to once again scuba dive with words instead of jet ski across the surface, i.e., to write the book, Carr had to move to the mountains of Colorado and disconnect himself from the new technologies.  This is basically what I have been doing for the past few years, only in moderation.  I still do not have a smartphone, because it will be the end of me.  I'll be playing endless games, checking my email, or texting while walking down the street instead of enjoying the swifts and swallows.

Slate referred to The Shallows as "the Silent Spring for the literary mind," and this is an apt description.  I recommend this book to anyone, not only for the excellent prose, historical context, and review of the literature, but as a catalyst to moderate one's use of the internet and reconnect with the world in a deeper, richer, and more meaningful way.

Now, it's time to scan political and scientific blogs until I pass out from a migraine.

Saturday, June 22, 2013

Warlick Response Condition 1

First, I'd like to thank Mr. Warlick (known in the Matrix as The Developer) for responding to my hastily prepared and informal response to his hastily prepared and informal synopsis of his larger talk.  
Condition 1: preparing children for an unpredictable future.

My point is that in the 1950s, '60s and '70s, we didn't know we were preparing students for such rapid change and there was little in the institution designed for such a task. Today we do know that rapid change characterizes our children's future, yet, too much of today's educational practices continue to be based on the assumptions of a half century ago. If we truly understood this one fact, that we can not describe the future for which we were preparing our children, then it would have a profound impact on what we teach, how we teach it, and the very roles of teacher and student.

I'm not entirely sure how much I agree or disagree with this idea.  That said, I do think David is making a leap of logic when discussing predictability and change.  Rate of change and predictability are two different ideas, and it's not possible to deduce one from the other.

For example, one can observe slow change that is unpredictable.  A drunkard slowly stumbling down the street is changing his location slowly, but often unpredictably.  A better example would be evolution by natural selection.  Drastic morphological changes take place slowly over geological time scales, but how an organism is going to specifically adapt to an environmental challenge is often unpredictable beyond the broadest generalities.

One can also observe fast, predictable change.  For example, the progression of many cancers, e.g., pancreatic, is fast and predictable.  A more operative example, technologically speaking, would be Moore's Law.  Gordon Moore predicted, almost 50 years ago, that the number of transistors per unit area of an integrated circuit would double every two years.  This prediction has been uncannily accurate, and only recently has it been suggested that the rate may be slowing to a doubling every three years.
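To make the arithmetic of that doubling rule concrete, here is a minimal sketch of my own (not anything from Moore or Warlick; the starting figure is the roughly 2,300 transistors of Intel's 1971 4004 chip):

    # Moore's Law as a simple doubling rule: count(t) = n0 * 2 ** (t / doubling_period)
    def transistor_count(n0, years_elapsed, doubling_period=2.0):
        """Project a transistor count forward assuming a fixed doubling period."""
        return n0 * 2 ** (years_elapsed / doubling_period)

    # Starting from ~2,300 transistors in 1971 and projecting 40 years ahead:
    print(f"{transistor_count(2300, 40):,.0f}")  # ~2.4 billion, in line with 2011-era chips

Twenty doublings in forty years is what makes the curve both fast and, so far, predictable.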

So what type of change are we experiencing?  Fast-unpredictable, slow-unpredictable, fast-predictable, or slow-predictable?  I haven't given this any deep thought yet, but I would think today's change most closely approximates fast-predictable change.  The reason for its fast pace is also the reason for its predictability: the information explosion our planet is experiencing.  This is allowing us to understand how the world works to a high degree of accuracy.  This understanding is leading to the fast pace of technological change, but also allowing us to predict more accurately what is possible in the immediate future.  That is, as our knowledge matures, our technological abilities and our predictive abilities increase hand in hand.  It does not seem very likely there is going to be a major paradigm-shifting discovery anytime soon, given how accurate our scientific theories are at present, though there are possible exceptions such as AI, abiogenesis, and complexity theory.  I'm certainly not saying we have a complete picture of the world, but we are approaching that line asymptotically.

(I'm going to use a long parenthesis, a la Stephen Jay Gould, to dispel a predictable objection to this line of thinking.  It has this general form: "People once believed the sun revolved around the earth and that their knowledge was close to complete; look how wrong they were!  Who knows how different our understanding will be in 500 years? It is pure hubris to think we have a better understanding of nature!"

This objection quickly deteriorates upon closer scrutiny.  When one begins to embark on a serious study of anything, one is initially going to have some very wrong ideas about its nature.  For instance, let's look at Sugata Mitra's hole-in-the-wall computer experiment.  The children initially tinkered around with the device, not knowing much about it, and displayed some wrong ideas with regard to how it worked.  After some time, manipulation, collaboration, and study, they began to understand it more and more until they were able to operate it about as well as any other layman.  Are we to say that just because they were very wrong at the beginning, and their understanding has changed so much since, their present understanding will also prove to be very wrong and will change just as drastically in the future?  Of course not.  And the same is true of humanity's general study of natural phenomena.  We truly are understanding the world much more accurately, so our technological abilities and predictive powers should also be increasing at about the same rate).

Hopefully, I'll be able to respond to the other points in full, but free time is in short supply at the moment.

Thursday, May 30, 2013

Warlick and Weinberger

I apologize if I begin to sound like a crotchety old man in these posts, but I am having trouble getting very much of this information through my overly cynical mental filters.

Warlick

First off, I appreciate David Warlick's dedication to improving education through the integration of technology over the last three decades; it is to be commended.  That said, even though he is only giving an informal synopsis of his longer talk, I cannot imagine increasing the length would make it any more comprehensible.  The talk seems to be a bag of poorly conceived ideas, unconstrained by the strictures of reality, thrown together in a haphazard fashion.  I bet that he actually has some interesting things to say about the subject, but like so many popular educational thinkers nowadays, he frequently overstates his points and tries to wax philosophical without the required logical arguments.  Let's look at his main points.

He believes there are three converging conditions that are going to force us to change the way we view education.

1.) "We are preparing kids for a future we cannot clearly describe.  This is the first time in history where we are facing a situation where we do not know our children's future.  We do not know the future we are preparing them for." 

I sure hope that David is (was) not a history teacher (he wasn't).  If he were, he would realize that we have never known or been able to predict the future (technologically, economically, socially, nationally, or otherwise) with any good degree of accuracy.  Here is a PBS Technology Timeline.  What events in this timeline were predicted twenty years beforehand, with accuracy?  Why is today any different?  We prepare students for the ever-changing present and the immediate future, adjusting instruction along the way if needed.

One of the problems with these talks is that no one seems to understand how to structure an argument or present a proposition with nuance.  If he were to say, "The future is more uncertain today than it ever was in the past because of (insert premises from which the proposition can be inferred)," then maybe I could agree with his point.  The world has been progressing pretty darn quickly since the Enlightenment, and the pace of progress is surely accelerating as complexity tends to build on itself (plus, as the population grows larger, there are so many more lives being lived in a given amount of time).  But, again, there has always been uncertainty, and I can think of no reason why this next generation is going to experience change drastically different from that experienced by the few generations preceding it.

2.) "Our kids are different...They spend a lot of time online, they're playing video games, they're engaging with online communities.  And they have come to understand information differently than my generation."

I don't believe students have fundamentally changed in one generation.  Yes, they are online more than we were at their age, but does this imply they are fundamentally different and need to be instructed differently?  I don't buy it.  My dad's generation watched a lot more TV than his parents' generation, and they in turn listened to more radio than their parents.  Either way, every generation still gets its information from reading, listening, watching, and doing.  I have not yet heard of students having infrared binary information beamed directly onto their retinas.

Students certainly have more access to information than previous generations could have ever dreamed of, but the saturation point was exceeded long ago, before the Library of Alexandria burnt to the ground.  That is, we have had way too much information for any one person to process for thousands of years, and piling on more information does not profoundly change this dilemma facing us as individuals.

He speaks as if this new generation experiences the world in some strange, wonderful, or magical way.  I could almost guarantee they process the world the same way we do, but without the memories of how difficult it used to be (in hindsight) to access the vast information our species has acquired.  Information is still a product, but an increasingly free product.  And it is still manipulated in pretty much the same way it always was. What does he think students are doing with this information that is so unique?  Talking about it? Writing about it? Sharing their ideas? Combining different insights? Generalizing?

[added by edit] I think I was wrong to say that students are manipulating information in the same way as previous generations.  While I still think this is true with regard to the content conveyed by the information, i.e., they are still just reading it, critically analyzing it, discussing it, etc., albeit more extensively and easily, what is innovative is how they can organize this information into personalized taxonomies.  These taxonomies can then evolve in tandem with others'.

This will not only provide an external framework for storing, retrieving, and sharing information, but can also provide a new and interesting internal framework for the same purpose.  This is essentially what it means to understand something, i.e., integrate it into an internal hierarchical framework to add context and establish connections with other information. Whether the distinction between content conveyed and its organization can be justified, I am not sure.  If not, then it may be perfectly legitimate to say that they are actually changing knowledge by adding descriptors, tags, hierarchies, etc.

Is this a good thing?  I am not sure, as there are already well-established frameworks ("tags") for retrieving this information that were developed by experts while the content itself was being developed.  For instance, if someone tags a lubber grasshopper as a "bug," it will be true in the colloquial sense, but not in the entomological sense.  Maybe it would be better for experts to organize their information, and for amateurs to learn how to navigate their frameworks; as I mentioned earlier, these frameworks are probably continuous with the content, so having an improper framework is deleterious to comprehension.  [/added by edit]
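To make that contrast concrete, here is a toy sketch of my own (the data structures are purely illustrative, not anything Warlick or Weinberger propose): a free-form folksonomy of tags next to an expert taxonomy for the same animal.

    # A folksonomy is just free-form tags; an expert taxonomy carries the Linnaean hierarchy.
    folksonomy = {
        "lubber grasshopper": {"bug", "insect", "garden pest"},  # colloquial tags
    }
    expert_taxonomy = {
        "lubber grasshopper": ["Animalia", "Arthropoda", "Insecta", "Orthoptera"],
    }

    # The colloquial "bug" tag misleads: entomologically, true bugs are the order Hemiptera,
    # and a grasshopper (order Orthoptera) is not one of them.
    print("Hemiptera" in expert_taxonomy["lubber grasshopper"])  # False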

3.) "The information.  We have this thing called Web 2.0...the very shape of information is changing.  The information is more of a conversation...Now we are increasingly going to the community to get the answer."

From what I can understand with regard to this segment, David is saying that people are collaborating more in order to find out the answers to questions that no authorities can help with.  I can't disagree with that, but information is not changing into an open conversation for all fields.  Yes, at the cutting edge of knowledge, where David thinks he is, the internet allows more collaboration at faster rates and the information gets modified more quickly.  But for most of what students are learning, the knowledge is still the product of thousands of individuals over thousands of years; it is not new and developing.

The video starts getting really strange at this point.  He tries to offer a new definition of literacy for the age of information, but simply states goals that have always been important in education.  He states the new literacy is defined by the ability to:

1.) Expose the truth:  i.e., critically evaluating information, which dates back, formally, to at least the Greeks.
2.) Process and employ information: how is this only applicable in our new world of easily accessible information?
3.) Express ideas compellingly: Ibid.
4.) Use information ethically: Ibid.

Weinberger

The David Weinberger podcast was a little more restrained and I really couldn't disagree with much except their armchair philosophical claims that the nature of knowledge is radically changing.

The hyperlinking phenomenon is great, and it does show explicitly the limits of trying to demarcate knowledge.  But still, we do need to demarcate it for any practical purpose.  I think one of the best uses of this new accessibility to information (hyperlinking, videos, forums, etc.) is the interactive, multimedia e-book.  Some things are best shown in video format and/or through the use of an application, and e-books can integrate this seamlessly into a text.  For instance, Carl Zimmer's evolution textbook Evolution: Making Sense of Life and Richard Dawkins's The Magic of Reality both employ various multimedia and hyperlinks to convey their information.  Once this art is perfected, I predict it will make learning the material that much easier for students.

David and Alan also discuss the fact that the act of learning is becoming more public as students participate more in online blogs, forums, and chatrooms.  I would have to agree that this is something new and interesting.  Learning should become easier if students can get most questions answered immediately by referencing previous online discussions of the same query.  It will be a great resource, but I think it's just going to expand upon classical methods of learning, not fundamentally change the game (or "change the nature of knowledge as we know it").  Who knows, though?

Tuesday, May 21, 2013

On The Origin of Speciousness

Some days I have to wonder whether I am living in a bizarro world; today is one of those days.  After watching Sir Ken Robinson's How to Escape Education's Death Valley and Seth Godin's Stop Stealing Dreams, my mind feels numb from the number of falsehoods passed off as truths and the number of banalities passed off as profundities.  I can only surmise after viewing these lectures that Ken and Seth's only experience with education is watching Ben Stein in Ferris Bueller's Day Off, as this is the only way I can understand their straw-man mischaracterization of the educational endeavor.  Disentangling and refuting all the nonsense is beyond the time and space available for this post, but I will highlight some examples.

Ken has 3 principles that he deems crucial for the human mind to flourish.  Let's look at these.

1. Human beings are naturally different and diverse.

During this section, Ken makes the half-banal, half-false statement, "If you have got two children, or more, I bet you they are completely different from each other." This gets applause for some reason. Wilson Bentley, or the "Snowflake Man," said of snowflakes, " [Even though no two snowflakes are identical,] it is not difficult to find two or more crystals that are nearly, if not the same, in outline."  And so it is with children.

All human beings share 99.9 percent of their DNA, and our neuroanatomy is similarly uniform.  Most traits used to characterize human beings fall on a normal, or Gaussian, distribution curve.  There is a broad range in the middle where most students are very similar with regard to intelligence, aptitude, enthusiasm, and achievement, and this is the general area where daily instructional strategies should be focused.  Is Ken suggesting that we customize an educational program for every minor difference between students?  Does every student need custom-sewn clothing because they are "completely different from one another"?  Of course not: there are broad groups that are very similar and can wear similarly sized clothing, just as there are broad groups in a classroom that can learn from similar instructional strategies.
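As a back-of-the-envelope illustration of how much of a normally distributed trait sits in that broad middle, here is my own sketch (the IQ-style scale is an assumed example, not anything from Robinson's talk):

    # How much of a normal distribution lies within one and two standard deviations of the mean?
    from statistics import NormalDist

    trait = NormalDist(mu=100, sigma=15)  # an assumed IQ-like scale, purely for illustration
    within_one_sd = trait.cdf(115) - trait.cdf(85)
    within_two_sd = trait.cdf(130) - trait.cdf(70)
    print(f"within 1 SD: {within_one_sd:.0%}")  # ~68%
    print(f"within 2 SD: {within_two_sd:.0%}")  # ~95%

Roughly two-thirds of students fall within one standard deviation of the mean on such a trait, which is the "broad range in the middle" I have in mind.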

2. If you can light the spark of curiosity in a child, they will learn forever without further assistance.

This statement is a good example of the problem with educational "experts" like Ken and Seth; namely, where is the evidence for this statement?  Where is the research or data?  Is it a priori true?  Is it a tautology, i.e., if they aren't lifelong learners, you by definition didn't "light the spark of curiosity"?  What value does this statement have besides getting applause?

The real question is, will children naturally learn at the pace we want them to and the content we want them to learn?  Will they eventually stagnate?  This is very similar to the question: which is the better instructional strategy, constructivist or direct instruction?  Direct instruction does better by all measurements, as John Hattie showed in his landmark meta-analysis.  Harvard professor Jeanne Chall also comes to the same conclusion in "The Academic Achievement Challenge."  If students guide themselves in their learning, I doubt they will enthusiastically learn arithmetic, algebra, trigonometry, and calculus based on their natural curiosity about these subjects.  Some things we learn are not immediately gratifying, but are extremely important and useful nonetheless.

And with regard to the STEM curriculum and standardized testing, I agree these are not the be-all and end-all of education, but they are extremely important if students are going to be able to compete in the workplace of the future.  It is our responsibility to give them the opportunity to excel in these subjects more so than the arts and humanities, as projections don't show a marked increase in the demand for literary critics in 2035.

Excelling at standardized tests shouldn't be the only goal of education, but being able to do well on these tests is certainly necessary, in most cases, for showing proficiency in the subject matter.  Not sufficient, but necessary.  The fact is, these tests are not that difficult if you know the subject matter for the grade level being tested.  Failing these tests is a big indicator one is not learning the material.  What other, objective way should we measure their intellectual achievement?  The teacher's vague impression of their ability and knowledge?

3. Children are naturally creative.

As with his other statements, it's hard to judge whether one agrees or disagrees with it because it is so general and devoid of content.  Yes, I guess they are creative?  Are they very creative or a little bit creative?  Are some not creative?  Are they creative in ways that we want them to be, as in mathematics, or are they creative in the way they kill ants?

This post is already getting too long, so I'll try to be quick with regard to Stop Stealing Dreams.

"And then [Horace Mann] needed more teachers, and so he built a new school for teachers.  Do you know what it's called? The normal school.  He called it the normal school, where they train people to teach in the common school,  because he wanted you to be normal."

Where is he getting this from, exactly?  Normal schools were not created by Horace Mann in the 19th century, but were a 17th-century French invention; they were called écoles normales.  The point was to create a standard, or norm, of excellence for teachers, since before this time there was very little regulation as to who could and could not be called a teacher.  It has nothing to do with turning our children into conformist robots, despite Seth's paranoid worries.

And how exactly did I know that?  I memorized it from a book, David Labaree's The Trouble with Ed Schools.  On that point, books (especially textbooks) are a valuable source of information; understanding and retaining the contents within them is a very useful skill, unless you think you can look up and learn everything you need to know on the fly.

As for his 8 educational reforms:

1. Get rid of live lectures.  Why would you want this?  Is the same lecture sufficient for all students and classrooms?  Shouldn't lectures be personalized to the audience?  Wouldn't it be better if students could ask questions during the lecture to clarify?  Wouldn't integrating work and feedback into the lecture be a good idea?

2. Never memorize anything.  Again, I think being able to retain information is an important skill, but maybe we are just designed to look things up over and over again without ever copying that information into larger schema and frameworks within our minds.

3. Get rid of learning orders.  Why?  Sequence seems essential: learning builds on learning, so some subjects are needed as prerequisites for others.

4. Precise, focused education.  Good in theory, but difficult to implement effectively and affordably.  Plus, what about everybody listening to the same lectures and online curriculum he proposed earlier?

The other four are not any better.

Towards the end, he mentioned giving students an Arduino (a small programmable circuit board, in the spirit of a Raspberry Pi), and says something to the effect of, "Why can't we give these to kids and say, do something interesting, figure it out.  If you need help, ask questions."  How will this kid figure it out?  By asking questions.  By being taught by the teacher.  By reading the text.  By collaborating with other students.  What if all the students have similar problems and questions?  Put them in a room together.  This is called a classroom, and it is how we teach students.

Next, he unleashes a fusillade of misrepresentations with regard to how teachers act: "Do not figure it out.  Do not ask questions I do not know the answer to.  Do not look it up.  Do not vary from the curriculum.  And better, better, better, comply."  Again, I have to believe that Seth has not been in a classroom in the last 40-50 years, as I have never had teachers who held these attitudes towards education.

He then mentions it is a myth that good parenting and academic success are predictors of career success and happiness.  As always, no evidence is cited.