Except for a brief period when I was learning how to walk, I’ve almost always had access to some sort of commercial computing device (the first I played with was a Commodore 64). Since then, I have generally taken each new innovation for granted while dabbling in related activities (e.g. web design, programming) that interested me. When I started graduate school, I heard a funny term that supposedly describes my generation: “digital native.” And it conjured up images of Tron.
Since then, I have read numerous research articles and at least one book on the topic, but my initial opinion of the term hasn’t really changed; if anything, it has been affirmed. Like some other scholars, I don’t particularly like it. I think it’s misleading for those who haven’t done their homework. Digital natives aren’t all programmers or IT people (and scholars would argue that this isn’t their claim), nor are they all heavy technology users who are comfortable with it. The lucky ones (if I dare define without citing my sources) are people who have grown up with newer technologies and are used to rapid innovation; the unlucky ones face a huge disadvantage if they don’t get access before college. Yet the term and the misperceptions persist.
What prompted me to address this was this past Tuesday’s ProfHacker article in The Chronicle of Higher Education. The author, and some of the subsequent reactions, seemed surprised that students weren’t all gung-ho about using the iPad. This particular professor prefaced the results with the following:
Nearly all of these students fit the profile of what Marc Prensky calls “digital natives,” those who have used digital tools their entire lives. They rely on text messages to communicate. They carry laptop computers. When they research, they go online. They subscribe to Netflix or use torrents, and listen to downloaded music. Multitasking comes naturally.
This seems to imply that they’ll take to the iPad easily as well, which I think demonstrates a fundamental misunderstanding of technology (though he does address the pitfalls later). Don’t get me wrong: there’s nothing wrong with trying new technologies, but I would hesitate to make broad assumptions about their use or adoption. The iPad isn’t a panacea for lack of engagement, nor a solution in and of itself to increase engagement. It takes purposeful planning, iterative evaluation, and continuous development.
I’m reminded of the NY Times article about a seemingly failed infusion of technology into the Kyrene School District (which coincidentally is where I did my public schooling). Reading and math scores didn’t improve after money was poured into creating a 21st-century classroom without any specific plan, and so some have leaped to the conclusion that the technology is a waste of money or ineffective. While it’s not my goal to argue this particular point, it doesn’t take a lot of thought to figure out why that reasoning is faulty.
College campuses are eager to jump on the next big thing to respond to the needs of the current generation. In practice, those efforts often end up clunky or ultimately fail. We’re all very good at adopting technology and talking about it, but practical application and widespread adoption by patrons leave a little to be desired. We’ve become more of our own target audience than the one we think we’re serving.
But responding to and engaging with the digital native is about understanding the ways in which today’s technology is actually being used (as the most recent Project Information Literacy study has attempted) and the value it provides (unless you know for sure you’re a trendsetter). It’s also about knowing that iPads and other devices are not exotic machinery accessible only to my generation, nor the only way to reach us, just as VCRs and pagers did not define communication for the previous generation.
I’ve only painted part of the picture, but what do you think? Is the idea or characterization of the “digital native” something you use, study, or find useful?