If you’re an instructional designer or an educator with an interest in technology, you’ve probably heard someone use the term “digital native” to refer to young students who are innately tech savvy because they’ve been using the internet and digital technologies for as long as they can remember. You’ve probably also heard someone refer to today’s instructors—particularly older educators—as digital immigrants because they lack the same level of “fluency” in the technology skills, language, and culture that digital natives possess.
When Marc Prensky wrote “Digital Natives, Digital Immigrants” in 2001, he presented several spot-on observations about how some older instructors are unwilling or unable to embrace digital technology and culture in the same way that some immigrants never embrace the language and customs of a new country. While a few of his observations are lighthearted, he insists that the consequences of this trend are quite serious. To drive this point home, he claims that game-based learning can be used in all subject areas and implies that educators who reject this idea are dumb, lazy, and ineffective.
> A frequent objection I hear from Digital Immigrant educators is “this approach is great for facts, but it wouldn’t work for my subject.” Nonsense. This is just rationalization and lack of imagination. In my talks I now include “thought experiments” where I invite professors and teachers to suggest a subject or topic, and I attempt—on the spot—to invent a game or other Digital Native method for learning it. Classical philosophy? Create a game in which the philosophers debate and the learners have to pick out what each would say. The Holocaust? Create a simulation where students role-play the meeting at Wannsee, or one where they can experience the true horror of the camps, as opposed to the films like Schindler’s List. It’s just dumb (and lazy) of educators—not to mention ineffective—to presume that (despite their traditions) the Digital Immigrant way is the only way to teach and that the Digital Natives’ “language” is not as capable as their own of encompassing any and every idea.
In his 2006 article “Listen to the Natives,” Prensky continues to emphasize that instructors must change their ways and place higher emphasis on engagement, stating, “As educators, we must take our cues from our students’ 21st-century innovations and behaviors, abandoning, in many cases, our own predigital instincts and comfort zones. Teachers must practice putting engagement before content when teaching.”
As someone who spent much of his childhood (and now a decent chunk of adulthood) playing video games, I love the idea of instructors integrating more games, simulations, and challenge-based learning activities into their courses. And there is mounting evidence that computer games can provide students with critical skills they need to succeed in the 21st-century job market. A 2006 Wired article, “You Play World of Warcraft? You’re Hired!” describes how management at Yahoo! considered a candidate’s achievements as a leader in the multiplayer game World of Warcraft to be an asset that set him apart from other applicants for a position as senior director of engineering operations.
Unfortunately, what educational-game-loving scholars fail to acknowledge is that even when we can prove that students have learned something more effectively and efficiently through game-based learning, we have to consider the return on investment. And by investment, I don’t mean the amount of time students have to spend playing the game in order to master a particular number of concepts or commit a certain number of facts to memory (although this should be evaluated as well). I’m referring to the amount of time and money it takes instructors, instructional designers, graphic artists, and programmers to develop educational games—or any multimedia learning resources for that matter.
Any game designer will tell you that even a high-budget, state-of-the-art video game will look dated within a few years of its release. Even the games featured on Prensky’s own company website, games2train, are showing their age. This isn’t necessarily an indicator that Prensky and his team are poor game designers. It just confirms that games often take a great deal of time and money to build and have a relatively short window of usefulness before they need to be updated or completely redesigned.
I think Prensky would argue that at the very least, instructors could do more to engage digital natives with low-tech games and simulations that increase learner engagement. His suggestions for games to teach philosophy or the Holocaust don’t necessarily require much more than a good set of role-playing instructions or a collection of powerful images from concentration camps and a provocative discussion prompt.
If the message were simply, “Let’s rethink the design of our assessments and learning activities so they’re more interactive and engaging,” I’d be all for it. However, the message I often hear (and the one the faculty I train often hear) is that instructors should make their course material as riveting and addictive as the bestselling video game du jour. That’s a lot to live up to, especially for a faculty member who, until recently, was feeling quite proud of herself for finally learning how to resize and crop a photo in PowerPoint.
I’m overjoyed when faculty come to me with grand visions for a multimedia game or simulation, but I know they often feel daunted when I tell them what they’ll need to contribute to the project. That’s why I typically brainstorm with them to find the most low-tech solution that meets their needs, and then we build on that as time allows. I also like to look at their learning materials and ask a few questions to make sure we’re not putting the cart before the horse. These questions include:
- Are the course materials broken down into manageable segments?
- Can students easily stop reading, listening, or watching and pick up where they left off later?
- Is it clear to students why they should read or watch each resource?
- Are resources prioritized? Is it clear which resources are the most important and which resources are optional?
- Will students know what terms to watch for or what questions to ask themselves as they go through the material?
- Are there ungraded knowledge checks to ensure students know if they’ve missed something?
- Do some assessments require application of the concepts? Are students asked to think critically about what they’ve learned?
- Do discussions encourage an exchange of diverse ideas and opinions? Or are students simply asked to regurgitate content from the resources and provide answers that will be repetitive and unoriginal?
This isn’t an exhaustive list, but I think it’s a good place to start. It might not generate the same buzz as turning a Holocaust lesson into a video game and accusing veteran professors of being lazy and behind the times, but at least we can rest assured that our priorities are in order and our courses are built on strong foundations. In addition, addressing fundamental course-design questions first and encouraging digital immigrants’ efforts does more than improve course quality. It provides digital immigrants with a starting point that feels welcoming and manageable—an Ellis Island of instructional design, if you will. It builds their confidence and encourages them to try new things. It replaces shame and guilt with pride and optimism.
We might not be able to completely transform an academic environment that can be hostile to digital immigrants, but we can strive to be better ambassadors of the digital culture we love. In the process, we can foster a melting pot of ideas and approaches to teaching that draws strength from diversity. And that’s the kind of immigration reform that benefits digital immigrants and digital natives alike.