American Culture

Educating the 21st Century cyberstudent…or not?

Don Tapscott has some radical new ideas about education. Here’s a sampling (as related by ReadWriteWeb):

  • “…the age of learning through the memorization of facts and figures is coming to an end. Instead, students should be taught to think creatively and better understand the knowledge that’s available online.”
  • “…Google, Wikipedia, and other online libraries means that rote memorization is no longer a necessary part of education.”
  • “Teachers are no longer the fountain of knowledge; the internet is…”
  • “Kids should learn about history to understand the world and why things are the way they are. But they don’t need to know all the dates. It is enough that they know about the Battle of Hastings, without having to memorize that it was in 1066. They can look that up and position it in history with a click on Google.”

(These last two are quotes directly from Tapscott, by the way, and I need to go pick up this book. It seems awfully interesting – but for now the RWW report will have to do.)

That one item – “Teachers are no longer the fountain of knowledge; the internet is…” – is among the most terrifying concepts I’ve ever run across, by the way. I don’t know exactly how he intends us to understand the pronouncement, but the Internet is not a fountain of knowledge, at least not in the absence of strong thinking skills. It’s a firehose of data, to be sure, but as I’ve noted before, data isn’t quite information, information isn’t knowledge, and knowledge isn’t wisdom. More on this later.

Way, way back in 1989 my fellow scrogue, Dr. Jim Booth, and I did a series of seminars where we actually argued that these kinds of changes were already happening (the essay linked here was updated slightly in the mid-’90s to account for the emergence of the early Internet). At that point we weren’t talking about the Net so much, of course, but were mostly focused on how the socializing process of television was altering the function and utility of the human brain.

Thanks to television and instantaneous global communications, thanks to the electronic data base, to the video game system, word processor, hand-held calculator, digital synthesizer, computer billboard and infonet – thanks to a boggling array of modern and post-modern amusements and conveniences, humans have evolved, perhaps more rapidly and more dramatically than at any time in our history.

From a traditional perspective, we simply don’t know all the things we’re supposed to know. A number of writers and researchers have argued, quite persuasively, that American students are impoverished in basic geography, history, literature, and math skills.

However, while Jane can’t perform long division, she is pretty handy with a calculator. Maybe Johnny can’t spell, but his word processor, like mine, has a built-in spell-checker. And while Danny is probably beyond hope, Jimmy knows exactly where to go to find out all he needs to know about Mexico – especially if his computer is on-line with an interactive infonet like The Source or CompuServe.

We went so far as to argue that the moment we were in – or more accurately are in – represented a critical leap ahead in human evolution.

A cursory glance at the Geologic Timetable in Webster’s Dictionary reveals that major evolutionary and anthropological events often parallel significant geological shifts. The first evidence of humanity, for example, roughly coincides with the onset of the Quaternary Period some two million years ago. A Wake Forest University Anthropology professor we consulted recently pointed out certain major changes in human living patterns at the beginning of the Holocene Epoch – the “recent,” or post-glacial period.

It isn’t at all unreasonable to wonder whether we are in the midst of what geologists 10,000 years from now might see as the transition from Holocene to whatever comes next. The difference between the dawn of this epoch and all others before it, though, is that this time it will be engineered. The environmental changes which loom now are the exclusive product of human technology.

For the heck of it, we termed this transition from human to posthuman the “cyberlithic.”

There’s no question, as Tapscott believes, that our brains are being re-wired – Jim and I made that point in these seminars, too – but I wonder how hellish the cost may prove to be. As RWW notes:

Today’s students are growing up in a world where multi-tasking has them completely immersed in digital experiences. They text and surf the net while listening to music and updating their Facebook page. This “continuous partial attention” and its impacts on our brains is a much-discussed topic these days in educational circles. Are we driving distracted or have our brains adapted to the incoming stimuli?

I know that much has been made of the digital generation’s ability to multi-task, for instance, but every piece of evidence I’ve seen makes clear that doing several things at once reduces overall efficiency.

Dr. Gary Small, a researcher at UCLA, is also examining how daily use of digital tech re-wires the brain. His particular concern has to do with the erosion of social skills.

When the brain spends more time on technology-related tasks and less time exposed to other people, it drifts away from fundamental social skills like reading facial expressions during conversation, Small asserts.

So brain circuits involved in face-to-face contact can become weaker, he suggests. That may lead to social awkwardness, an inability to interpret nonverbal messages, isolation and less interest in traditional classroom learning.

Small says the effect is strongest in so-called digital natives — people in their teens and 20s who have been “digitally hard-wired since toddlerhood.” He thinks it’s important to help the digital natives improve their social skills and older people — digital immigrants — improve their technology skills.

(Of course, Small’s brainiac theories are thoroughly refuted by “at least one 19-year-old Internet enthusiast” who “lives near Pasadena” and “spends six to 12 hours online a day.” This, though, is probably something that should wait until my next missive on the sorry state of science reporting in America.)

The “brain as computer” model that Jim and I discuss only works if education does a good job of developing the processing and search functions. Yes, all the data is online, or soon will be. So maybe it’s not critical that we have it all memorized. But, are we capable of finding what we need quickly and efficiently? Are we adept at sorting information from disinformation? And most importantly, are we able to think critically about the data we retrieve?

All the evidence I see around me says we’re failing on all fronts. A former colleague, an incredibly accomplished man who these days teaches undergrads for a living, once observed something to the effect that “once they get past downloading music, IMing their friends and surfing porn, these kids are helpless with computers.” Maybe he exaggerates a little for emphasis, but my experience (and the volumes of research supporting it) says that the techspertise of “today’s youth” is overrated. Their lives are dominated by electronic technology, to be sure, and they can become fluid end-users, but you have to be careful about using the word “savvy.”

On top of this, the Millennial Generation has been trained to be very good at short-term tasks with readily identifiable objectives. This has come at the expense of teaching them abstraction, critical evaluation, and problem-solving skills. They’re far better in teams than my generation (which is nearly feral in its individualistic approach), and this mitigates the problem significantly when they’re allowed to work in groups. Still, the premise that the ubiquitous availability of every scrap of information in the world somehow renders obsolete old ways of knowing and learning is … suspect. At best.

In some ways it’s nice to reflect, after being so wrong about so many things in my life, on something that I was part of getting right a long time ago. Still, seeing the early emergence of an important (and perhaps obvious) trend is hardly the same thing as having a robust solution for all the problems that will result. The fact is that we do, absolutely, have new tools at our disposal that can enable us to dramatically improve the sum total of what is known. The Net can help us generate new data, and as I suggested above, with the proper kind of education we can develop that data into information, and from there transform information into knowledge and eventually sift wisdom from the knowledge.

But the machines won’t do it by themselves. It’s up to us to craft the policies and processes that turn the machinery to the uses we want and need, and we have barely taken the first step down that road…

14 replies

  1. Interesting to think about. My 3rd grader is learning how to type, do internet research, and put together a PowerPoint presentation this year. And while last year was all about rote memorization, this year has been much more about the WHY than about the WHAT. Instead of memorizing multiplication tables, she’s being taught to think about things in groups – so 5 x 3 is actually 5 groups of 3, or 3 groups of 5. This means she has a way to figure out harder problems, instead of just having to parrot out facts.

    I must say I’m enjoying this year much more than last.

  2. The geologic term for what scientists will call this era (if they ever agree that the Holocene is over, anyway) is the Anthropocene.

    It’s also one thing to know how to use the tools, it’s another to know how to design and build the tools.

  3. “…the age of learning through the memorization of facts and figures is coming to an end…

    Memorization has never been synonymous with learning; no one with any meaningful educational experience would posit that it ever had been.

    “Teachers are no longer the fountain of knowledge; the internet is…”

    Again, a completely mistaken idea of the role of teachers; maybe Don should Google “Socrates.”

    It is enough that they know about the Battle of Hastings, without having to memorize that it was in 1066. They can look that up and position it in history with a click on Google.
    “…Google, Wikipedia, and other online libraries means that rote memorization is no longer a necessary part of education.”

    Right. Books already did that, too. Contextualized learning is not a new concept.

    What a perfect example of precisely the kind of lazy, uninformed, shallow thinking that the Internet may make easier… but certainly didn’t cause.

    Next time, just tell me what you, Denny and Jim think about it and leave out the talking head.

  4. I have this nagging feeling about a ‘forest for the trees’ phenomenon whenever I hear the word “generation.”

    There is no “generation.” And all these things apply to all sorts of age groups here, today. Grandmothers are online. Old nerds are programming all day.

    People like to group people into “generations” but the problem is the generalizations. No, they aren’t all doing this, or that. They aren’t all responding in so and so way. This is, to me, a shortcut to thinking, and it is not revealing.

    I don’t like group think, nor generalizing about groups.

    As for the reliability of information on the Internet, prior to that we had word of mouth, scrolls, pamphlets, magazines and books. All of which had lots of misinformation. It was always the sharper people who could tell the difference and critique the lamer arguments. The Internet is a plus in so many ways, not least of which is the time it takes to find information. You don’t have to drive to a library, search cards, try and find a book, try and see if your initial question is even addressed in this book in a meaningful way… etc.

    All that generalized muckity muck thrown at the internet denies the strengths that the internet brings. The pluses far outweigh the minuses. What we need is more focus on critically analyzing the data there, rather than a rejection of this most powerful tool.

    People (of EVERY generation) are lazy and surface thinkers for the most part. They want the path of least resistance. If that means settling for whatever your friend told you is true, whether face to face, over the phone, in a text message, in an email, or on a web page — rather than investigating more credible sources — then most people will take the questionable source over the more informed source. It takes work to track sources and analyze them, and at some point there is a diminishing return. Again, nothing whatsoever to do with generations, and everything to do with basic human nature and time constraints.

    I’ve spent several years tracking down sources and analyzing who is credible in regards to the 9/11 attacks, terrorism, Iraq and government crimes. So I do have some experience in thinking about these matters.

  5. First John tells us that “I don’t like group think, nor generalizing about groups.” Then he tells us that “People (of EVERY generation) are lazy and surface thinkers for the most part.”

    This is some seriously funny stuff. Keep ’em coming.

  6. I’m not sure “solution” is the right word. We need to adapt faster, not the students. This stuff will continue to evolve regardless of what we do. I’m not even sure we can guide this particular evolution – technology is changing too rapidly. By the time any high-end policy is in place, it’s irrelevant. Hell, the smart cell phone I bought last year is already out of date. At best, we might figure out how to adapt the fundamentals of learning to this firehose of data – figure out how to teach critical thinking in such a way that it involves the internet.

    Take PubMed, for instance. Students have access to a HUGE scientific database. There’s no way I can possibly keep up with everything my students are doing and tracking. All I can do is get them to explain their thought process, point out when that process might be illogical, and force them to rethink an idea. I can do this one-on-one easily enough, but how to do that on a mass scale, I have no idea. That’s actually one of the things that scares me about standing in front of a classroom full of budding scientists. 🙂

  7. I second Ann’s rant (except for the part about not telling us what Tapscott thinks, because I fear that he’s the type who will be listened to).

  8. I’m going to be contrary, typical of blog posters. I don’t like how I’m getting tied to the web. I find myself so incredibly impatient with real people and natural processes. Yes, it’s changing us, but I don’t think it’s necessarily something we want.

    Also, I think it’s good to memorize certain things – not everything should be rote memorization, but yeah, some stuff should be. Yeah, kids need to understand 5 x 3 is five groups of three – I have kids, and I’m a scientist by training. I get the concepts. But people also need to be able to just know that 5 x 3 = 15 without having to think about it. It should be automatic.

    And dates? They’re important. The lack of understanding of the real spans of time involved in history – and of real quantities generally – adds to the numerical and mathematical illiteracy in this country. People have lost a sense of scale. A 700-million-dollar bailout versus a 700-billion-dollar bailout? How many people don’t even get the difference anymore?

  9. Thank you, Beth. I am neither a scientist nor a mathematician, but I use those crazy multiplication tables (and fractions and long division and pi and the like) every single day, and I still remember my second-grade teacher, Mrs. Noyes, leading us in recitations. I also remember the math relay races – in line, up to the board, solve the problem, run back, next!

    She’s so right. Some things you just need to know without thinking about them.

  10. All right, I’ll clarify:

    “First John tells us that “I don’t like group think, nor generalizing about groups.” Then he tells us that “People (of EVERY generation) are lazy and surface thinkers for the most part.”

    This is some seriously funny stuff. Keep ’em coming.”

    The “generation” distinction – grouping people by where their birth date falls – is too arbitrary to be meaningful. I find it more pseudo-science than real science. Should I keep it coming?

    Observations that hold true for everyone are more reliable than generalizations about arbitrary groups defined by someone who hopes to sell a book or article based on phony links, like a birth-date range.

    There will be a bell curve of the laziest thinkers to the most extreme and obsessed. This holds true of all generations, all societies. The shape of the curve may vary, but it’s going to be heavy toward the “lazy” end of the spectrum.

    Perhaps, with the speed and ease of technology, the curve is being fundamentally altered in some way that we can’t quantify yet, living through it as we are. But it’s NOT a “generational” phenomenon; it’s a function of the state of technological progress.

    Now, are you going to respond meaningfully, or are we done?

    The “generation” distinction – grouping people by where their birth date falls – is too arbitrary to be meaningful. I find it more pseudo-science than real science. Should I keep it coming?

    Actually, you should read Howe and Strauss’s books. They’re pretty much premised on NOT categorizing exclusively by birth dates.

    We invite commentary. But when you dismiss X because they should have done Y, when in fact they DID do Y, you don’t do your credibility any favors.

    Just saying….
