Friday, June 10, 2011

It really would be different.

Trent Batson is a colleague who is always several steps ahead of me, and I love his Campus Technology article, "10 Rules of Teaching in this Century". If we really read, understood, and debated all the suggestions in this article, we would know pretty much everything we need to know about teaching with technology today.

It's a short, sweet piece, and I wonder if everyone who reads it really understands the revolutionary suggestions he is making. Right off the bat in suggestion #1, Trent advises: "Don't just tell students the key knowledge in your field, but help them discover it through problem-based active learning." This is something lots of us feel we already do, but in practice we're at a skewed angle to what Trent is actually suggesting.

The idea with problem-based learning is to actually replace all the class time normally spent on lectures with problem-solving activities. This is a hard one for many faculty to grasp. I have talked to faculty who say that they really must explain the material in class, otherwise the student won't really understand it the way they have to for future lessons/problems. Often the same faculty also say that students don't really apply what they've "learned" - which is really what they've heard - to the later activities in the class.

These are related problems. We must at least seriously consider giving up the model of education where we tell the student something, they absorb and remember it, and then later on they apply it. There's strong evidence that this model never really worked that well, and much stronger evidence that the learning that stays with the student is the learning they do on their own. When I ask faculty when they really "got" a key concept in their field, it was usually when they were doing the work - conducting an experiment, synthesizing research and writing up their ideas or findings, creating a piece of art, talking to a live interview subject, or creating a map for themselves - when they were doing the work that is the work of their field. Sometimes that "aha moment" occurs early in their educational careers, and such moments are why we value higher education where students have access to great professors, great learning facilities, small classes, field trips, and other experiences where those moments tend to happen. Sometimes, sadly, that "aha moment" doesn't happen until graduate school, when they do the work of their field for the first time and really catch the excitement of it, because they finally see that there are real questions to answer and how one might go about answering them.

It's not actually idealistic to try to move that "aha moment" all the way back to the first year of college. We must give up any idea we ever had, if we had it, that first-year students arrive on our campuses as blank slates ready to be filled with higher knowledge. In fact they've already had long educational careers, for better or worse, and we can ask them to build on those careers immediately by putting them to work doing the work that is the bread and butter of the field. It actually is not unreasonable to give them a reading assignment and then ask them to do some academic work based on what they read, rather than have us explicate it for them.

No student is immediately going to intuit how to develop a thesis statement, how to evaluate a definite integral, how to titrate an acid-base solution, or how to conjugate adjectives in Japanese. But great explanations of these tasks can actually be pretty brief, and we can often re-use great multimedia materials made by others for these basics. Instead of devoting 80-90% of the semester to explanation and 10% to problem-solving, we can flip it: spend 10% on explanation (again, using technology to help where we can) and 90% on problem-solving. Let the students get stuck, let the students fail, and then unstick them and show them how to improve. Students really don't get why or how to improve until they've failed, and we need to give them a lot more time to do that.
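To make that concrete with the definite integral example (a generic illustration of my own, not something from Trent's article): the entire worked explanation a student needs to get started can fit in a couple of lines,

\[
\int_0^2 3x^2 \, dx \;=\; \Big[\, x^3 \,\Big]_0^2 \;=\; 2^3 - 0^3 \;=\; 8 ,
\]

and everything beyond that - why the antiderivative works, what to do when it isn't obvious, how to catch your own mistakes - comes from the students evaluating dozens of integrals themselves, getting stuck, and getting unstuck.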

Imagine a class where instead of "covering" the material for a semester and then giving the student a final paper or exam to see if they "got it", students created work - research, experiments, writing, bibliographies, presentations, movies, lessons, summaries - from the very first day. Imagine that our job as faculty was to help direct them in doing the work - "Here's how to find the resource you need; here's how to complete that task" - in a manageable way (a video, audio, or text explanation made once for everyone, not one-on-one assistance for one student at a time). Imagine that they turned in the work - and it wasn't very good! And we said so! And then they did it again! And again! Imagine that by the third or fourth go-round, students were really starting to understand what it is that one does when one does math, or science, or literature, or art, or management, or film, or teaching, or any of the things we teach our students to do.

Imagine how much more they would understand the material, and how much of it they would own and take with them from the class, if we taught the class that way.

That's really the type of model Trent is suggesting, and I think he's right. We all know that you never understand anything so well as when you explain it to someone else (in your writing, your presentation, your portfolio, your anything), but previously it wasn't practical, we thought, to do something like that with every student in a class.

But that's exactly the type of thing technology really can help with. If I have 30 students in a class and I'm determined to teach this way, technology lets me share all sorts of resources with them beyond the book. Technology lets them collaborate and raise questions before class so I can answer them in class. Technology lets me give frequent low-stakes quizzes or papers so that students can get credit simply for completing them and I can see who's not keeping up, with very little time and effort during the semester. Technology makes it trivial for me to record a ten-minute "Here's where most of you are going astray" video on a Thursday, distribute it to all my students, and have them back in class on Monday with the next set of questions based on what I've just explained and on what they then tried, thought, and did.

When done well, it doesn't look like a traditional class, and some faculty don't feel like they're doing their job if they don't "cover" the material - by which they mean lecture, however much the lecture might be "enhanced" with multimedia materials. But I guarantee that the students who leave such a class, even if less material has been "covered", understand more of what was addressed during the course and are more likely to retain it. When they've applied it over and over and over and over again in your class, they sure as heck are going to be (or at least are more likely to be) able to apply it again the following semester.

To tackle it one more way: Our current model presupposes that we tell the student everything they need to know (yes, in addition to other activities, but really the bulk of class time is spent with us telling them what they need to know), and that then, at the end of the semester, they prove they know it by doing something - maybe a final exam (though the same could be true of papers, presentations, or any other type of "capstone" assessment). Faculty often cringe at the idea of students doing first the thing they were supposed to do last, and doing it badly. Why? What could possibly demonstrate better to students that they don't yet know what they came to class to master? I know some creative faculty who give the final on the first day of class. One may reasonably expect everyone to fail. What if the teacher then gave the students direction on how to derive, research, create, or find one or more of the answers, and then gave a similar test again? What if they took a "final" five times through the semester, each time figuring out more of how to do the work that a "final" represents?

What grade do you think they would get the last time they took that test?

And if you were the student, which type of work would you rather do?

Thursday, June 2, 2011

Education and digital citizenship

For the last few Boot Camps I've made a passing mention of "digital citizenship", just to ask whether we are addressing it sufficiently in higher education. The example I always use is whether our students are prepared to vote on questions having to do with electronic voting machines. It's a question as pertinent to the operation of democracy in our country as any in these times, and yet I can't seem to sell it as a pressing educational topic.

Yesterday I was surprised to hear an entire episode of the news program "On Point" devoted to questions of cyberattacks and possible military responses. A little Googling and I found at least one of the reasons why this topic came to a head yesterday: Lockheed Martin, obviously a defense contractor of the first order, had suffered a cyberattack. While several outlets are reporting that the attackers came up empty, others are also reporting that the U.S. is nonetheless quickly moving to consider cyberattacks "acts of war" and deciding how, when, and in what fashion to respond.

There is no clearer example of what it means to be a citizen of a 21st century nation.

And on NPR, where the story I first heard is available as a podcast, the very first of the 46 comments says "How can anyone listen to this program and have any trust in our elections where the vote counting is done in secret on electronic voting machines?"

There are a number of topics that make up "digital citizenship", and hopefully some of them, at least, are going to be addressed in K-12 educational environments going forward. But are we sure we're graduating students who are at least aware these topics exist? And if we're not, how can we possibly integrate these issues into a curriculum that's already overloaded, in the limited time we have with our students?

Some of you know what I'm going to say: we are going to have to give up something to get something else. It may very well be that we need to cut one lesson or unit from our syllabus to tackle digital citizenship questions somewhere in that same syllabus. But I suspect that we're already teaching a lot of topics that touch on these issues. It's not hard (using this particular topic list) to imagine the writing class that at least mentions digital literacy, the political science class that touches on digital rights and responsibilities or digital access (Arab Spring, anyone?), the legal studies class that addresses digital etiquette, or the economics or business class that includes digital communication as well as digital commerce. It may also be that we just need to consciously highlight the places in our curriculum where we're already discussing topics that are often quite new for our faculty, and make it clear that there are many open questions to be addressed.

And while I never advocate doing an exercise with electronic tools only once, it may be that one exercise on a digital citizenship topic in the course of a class is enough. Somewhere in our curricula we do need to at least address these topics.