I recently had an interesting discussion with an online acquaintance. We were discussing a tagging project she’s doing for a thriving LiveJournal community I moderate. She found the project to be very productive and educational, and I was happy to let her use her considerable skills as an organizer and archivist to clean up the hundreds of tags the community has generated in its four years of existence. She told me that I seemed to be very comfortable operating in the digital world and that my work style and behavior were those of a ‘digital native’. She then asked me my age, and was surprised when I told her. I’m 46, on the cusp of the Baby Boom and Generation X. She’s almost thirty, a genuine ‘Generation Y’.
Why the surprise? And why the significance of ages? It appears that folks of both Gen-X and the earlier Baby Boom generation are considered by certain scholars to be ‘digital immigrants’, while people of her generation, born in the late 70s and beyond, are considered ‘digital natives’ because they’ve grown up with all the wonderful digital toys we now take for granted. I had Merlin, she had Nintendo. But she was the person who introduced me to the idea of ‘digital natives’, and she paid me the compliment that my online behavior was like that of a ‘native’.
Of course, I was intrigued, so I went digging, as I tend to do when introduced to a new idea. There are lots of interesting articles about the cognitive and methodological differences between ‘natives’ and ‘immigrants’ and how the two groups apparently have a hard time speaking to each other. The ‘immigrants’ come out looking like clueless, often wrinkly lamers. What gets forgotten in all the labeling is that folks of my generation helped to create, build, test, and perfect all the lovely toys that the ‘natives’ take for granted today.
Let’s take the internet and computers, for two teensy little examples. Once upon a time, back in the ancient reaches of the nearly-forgotten Cold War, somebody got the brilliant idea of creating a means for the military and research labs at major universities to communicate even if there was a global nuclear disaster. This network was called ARPANET: the Advanced Research Projects Agency NETwork. It was the launchpad of the robust packet-switching technology that we take for granted today.
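The core idea, that a message gets chopped into small packets that can each take any route and then get reassembled at the far end, can be sketched in a few lines of Python. This is strictly a toy illustration of the concept, not ARPANET’s actual protocol (that was NCP, later TCP/IP); the packet size and function names here are my own invention.

```python
import random

PAYLOAD_SIZE = 8  # bytes per packet; real networks use far larger sizes

def packetize(message: bytes) -> list[tuple[int, bytes]]:
    """Split a message into (sequence_number, payload) packets."""
    return [
        (seq, message[i:i + PAYLOAD_SIZE])
        for seq, i in enumerate(range(0, len(message), PAYLOAD_SIZE))
    ]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Rebuild the message, in whatever order the packets arrived."""
    return b"".join(payload for _, payload in sorted(packets))

message = b"Packets can take any route and still arrive intact."
packets = packetize(message)
random.shuffle(packets)  # simulate out-of-order delivery over different routes
assert reassemble(packets) == message
```

The robustness the Cold War planners wanted falls out of the sequence numbers: since the receiver can reorder packets itself, no single fixed route through the network has to survive.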
I was a user on that network. I was on the Internet before it was the Internet (and had email before it was called email), when you had to know the Unix command-line interface to get stuff done, and there was no point-and-click anything. It was just an amber screen, ticker tape with ASCII code, fairly long waits for replies, primitive email only over teletype circuits, and lots of geeks, both civilian and military, sharing information. We were in heaven: we got to play with state-of-the-art goodies, and experimental stuff from DEC and Apple and Motorola and other Silicon Valley denizens. We got prototype gear hot off the lab benches of Stanford and other places. We were in the ‘delivery room’ of the Internet, and we were the first to use it. It was part of our jobs.
We had no idea that our private little electronic clubhouse would go public. No one did. But the Cold War ended, and the nascent Internet was one of those Cold War assets that got repurposed. Then Mosaic was created, Prodigy and CompuServe got people talking, dial-up BBSes started disappearing, and the first spam showed up on USENET. We knew then that our days of relative safety and bliss were numbered, but we went on building primitive web pages anyway. I could not imagine living without the Internet today.
Same thing with computers. I started playing with coffee-table sized VAX machines with high-speed printers the size of Volkswagens, which would make a bestial roar when they started up. I mounted tapes, loaded Hollerith punch cards, read ASCII punched paper tape, and troubleshot the system in the data processing shop. And I had time to play with the cantankerous new ‘micro-computers’ that the USAF was introducing into their communication and maintenance shops. “Use this to log your work,” our commander told us, and took away the Selectric. I became the computer expert, often butting heads with my boss, who thought that the machine was exclusively his to use. I ‘won’ when it broke and I fixed it. Soon after, I got my own personal computer, and really started digging into it.
But I digress. These scholarly papers I referred to earlier seem to be awfully ‘binary’ in their approach to the generational differences between types of users. Here is some of what they say:
Today’s students – K through college – represent the first generations to grow up with this new technology. They have spent their entire lives surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age. Today’s average college grads have spent less than 5,000 hours of their lives reading, but over 10,000 hours playing video games (not to mention 20,000 hours watching TV). Computer games, email, the Internet, cell phones and instant messaging are integral parts of their lives.
It is now clear that as a result of this ubiquitous environment and the sheer volume of their interaction with it, today’s students think and process information fundamentally differently from their predecessors [emphasis original]. These differences go far further and deeper than most educators suspect or realize. “Different kinds of experiences lead to different brain structures,” says Dr. Bruce D. Berry of Baylor College of Medicine. As we shall see in the next installment, it is very likely that our students’ brains have physically changed – and are different from ours – as a result of how they grew up. But whether or not this is literally true, we can say with certainty that their thinking patterns have changed. I will get to how they have changed in a minute.
What should we call these “new” students of today? Some refer to them as the N-[for Net]-gen or D-[for digital]-gen. But the most useful designation I have found for them is Digital Natives. Our students today are all “native speakers” of the digital language of computers, video games and the Internet.
So what does that make the rest of us? Those of us who were not born into the digital world but have, at some later point in our lives, become fascinated by and adopted many or most aspects of the new technology are, and always will be compared to them, Digital Immigrants.
The importance of the distinction is this: As Digital Immigrants learn – like all immigrants, some better than others – to adapt to their environment, they always retain, to some degree, their “accent,” that is, their foot in the past. [What is wrong with that? -ed] The “digital immigrant accent” can be seen in such things as turning to the Internet for information second rather than first, or in reading the manual for a program rather than assuming that the program itself will teach us to use it. [What’s wrong with RTFM? That’s how I learned my trade! -ed] Today’s older folk were “socialized” differently from their kids, and are now in the process of learning a new language. And a language learned later in life, scientists tell us, goes into a different part of the brain.
There are hundreds of examples of the digital immigrant accent. They include printing out your email (or having your secretary print it out for you – an even “thicker” accent); needing to print out a document written on the computer in order to edit it (rather than just editing on the screen); and bringing people physically into your office to see an interesting web site (rather than just sending them the URL). I’m sure you can think of one or two examples of your own without much effort. My own favorite example is the “Did you get my email?” phone call. Those of us who are Digital Immigrants can, and should, laugh at ourselves and our “accent.”
Hey, I’m guilty of printing out documents to annotate them and read from them, and making the ‘check your damn email!’ call. Does that make me an ‘immigrant’ or just a multitasker?
But the point is that these writers and researchers totally ignore or overlook people like me who were the earliest users and adopters, and are ourselves ‘natives’. Or more properly, we’re pioneers, since we’re the ones who built, tested, and worked the bugs out of many of these things. We were the people in the university computer labs, or in the military communications shops, who put this technology to real-world use, and, when we could, started bragging about it to our outside friends. (Some of it was classified. Some of it still is.) They wrote the manuals, we were the ‘field testers’. It was a heady time.
For the longest time, my new career and profession was met with deep skepticism by my father. “When are you going to get a real job?” he’d ask me. He thought that computers were just another fad like CB radio was in the seventies. He got into that big time, and was disappointed when it faded. He stopped following technology just as I joined the USAF.
But I’m a digital pioneer, and while they might still consider me an ‘immigrant’, I disagree. Here’s an interesting comparison between ‘natives’ and ‘immigrants’. Scroll down to about the middle for the comparison. Which are you?
Timothy Van Slyke has a more balanced and skeptical view of the idea that our very brain structures have been changed by technology:
I find it hard to believe that neurological structures could change to such a dramatic extent from one generation to the next. Yet even if we grant that digital natives think and learn somewhat differently than older generations, we may be doing them a disservice to de-emphasize “legacy” content such as reading, writing, and logical thinking, or to say that the methodologies we have used in the past are no longer relevant. For example, as a technology instructor of pre-service teachers, I found that while most of the younger students were proficient in using the Web, they could not adequately perform advanced searches or evaluate the validity of the resources they found. Digital immigrants and natives alike are bombarded with vast volumes of information in today’s electronic society, which, in my opinion, calls for an even greater emphasis on critical thinking and research skills – the very sort of “legacy” content that teachers have focused on since classical times.
The Internet, being a primary medium of this emerging culture, is certainly not something that we in education can ignore. Non-native educators will need to learn to incorporate the Internet into their teaching because, as Prensky notes, that is the first place the digital natives will go for information. But before we discard all of our digital immigrant notions of teaching and learning, and before we turn to video games and simulations as the primary modes of instruction, we should answer a number of questions.
First among these is whether all of today’s students fit Prensky’s definition of digital natives. Are all students, for example, exposed to information technology and video games to the same extent? What are the demographic differences?
I currently am living with my family in Hungary, raising two bicultural children. From this perspective, I take issue with a number of Prensky’s assertions about immigrants and cultural assimilation. It seems to me that Prensky overemphasizes the differences between his two groups and de-emphasizes the similarities. While it appears that the digital natives, on average, grew up reading less and engaging with digital media more, this does not mean that they are illiterate or unresponsive to traditional forms of teaching and learning. Like many observers of other cultures, Prensky overgeneralizes his description of the digital native and then draws dramatic conclusions from those generalizations. He states, for example, that “Kids born into any new culture learn the new language easily, and forcefully resist using the old” (2001a, p. 4) – an assertion that, in my experience at least, is completely unfounded. My own children are living examples of young people who have no problem functioning in two cultures: They can easily speak Hungarian or English depending on their environment, for example. Moreover, many immigrant youths who do fully assimilate into the new culture later regret the loss of connection to their parents’ background (Skerry, 2000). Cultural assimilation rarely entails a wholesale abandonment of previous customs or practices; rather, it typically involves a flexible process of negotiation and adaptation, wherein certain elements of both cultures are retained in a new combination with one another.
I like the way this guy thinks. As in anything, it’s preferable to look for common ground in order to congenially exploit and exalt the differences. Teaching our ‘digital natives’ their true legacy is one thing that we must do, lest they become totally forgetful of it. I did desktop publishing with those rub-on fonts before I got “Print Shop Deluxe”, and that instilled in me an understanding of the art of publishing. I hand-wrote my essays and journals before I had a computer, and honed my craft over hundreds of hours of writing letters and journals. The substance is what makes the surface, and that is what we must teach our descendants. We can bridge any divide, digital or generational, with deeper understanding.
And we digital pioneers can tell our kids the 21st Century version of the ‘when I was your age’ story. I call them ‘BC’ stories: before computers (or cable).