I am guilty of assuming that the younger a person is, the easier it is to teach them digital literacy. Recent reports have found that - not surprisingly - digital natives who have mastered Fortnite and texting while driving are neither able to distinguish between fake and real news, nor adept at meeting mid-skill job requirements for computer literacy. Add to that the fact that computer literacy is the #1 entry-level skill that can make or break a new entrant’s chances in the job market, and the topic takes on a fresh weight and focus.
The eagle-eyed among my readers will have noticed that I garbled my phrases in the intro above, blending “computer literacy” and “digital literacy” at random. Fresh off our discussions about definitions, I have found that a great many sites and authors interchange and conflate “computer literacy” and “digital literacy”.
This is especially true when migrating from an education terrain into a work or HR terrain - where employers and their HR teams have not made the nuanced distinction between digital skills and digital literacy that is made in the education sector.
While the teachers in my audience will probably know the difference, it does not help that the Common Core Standards are exceptionally vague about the type of digital skills students need to exhibit.
So, let’s do a Marie Kondo and declutter the concepts.
How computer literacy evolved into digital literacy
Computer literacy/digital skills (you can see where the confusion comes in) include being able to:
- Create and share digital content
- Find information on the internet
- Make online payments
- Set up an email account
- Construct a spreadsheet or document
- Load and interact with smartphone apps
- Use online resources for basic device troubleshooting
- Install and personalize basic software
In the early 1980s there was much hand-wringing among teachers and education policy-makers over how to teach computer literacy, given that many teachers did not have sufficient training and “micro-computers” were prohibitively expensive. Nonetheless, computer literacy was swiftly placed on the majority of school curricula.
As the digital dawn dawned and the internet ballooned (or blossomed, depending on your point of view), great strides were made in the development, understanding and application of education technology and technology education. By 1993, $40 billion had been invested in school IT infrastructure, and pedagogues and bureaucrats scrambled to maintain a definition of computer literacy that mirrored the rapid changes in information technology. At the time, computer literacy was in fact added to the “3 Rs” as a required basic subject, more or less defined as: “To communicate, to locate and manage information, and to use these tools effectively to support learning the content of ‘the other basics.’”
Enter the knowledge-based economy - a trending topic across sectors more or less from 2003 onward. At the same time, the Partnership for 21st Century Skills’ first report (2002) advocated: “To cope with the demands of the 21st century, people need to know more than core subjects. They need to know how to use their knowledge and skills— by thinking critically, applying knowledge to new situations, analyzing information, comprehending new ideas, communicating, collaborating, solving problems, making decisions”.
As new challenges such as cognitive dissonance, fake news, digital addiction, cyberbullying and online predation emerged, students needed a more substantial arsenal of skills to navigate the digital terrain. Computer literacy was subsumed by the emerging need for digital literacy, alongside the more specific “digital citizenship”. The International Society for Technology in Education (ISTE), the de facto arbiter of technology in education, has worked to define and identify a finite list of attributes/skills that demonstrate digital literacy:
- Empowered Learner
- Digital Citizen
- Knowledge Constructor
- Innovative Designer
- Computational Thinker
- Creative Communicator
- Global Collaborator
An invisible skills gap?
I’ve taken a rather long, yet hopefully interesting, route in describing how computer literacy evolved into digital literacy. However, my contention is that we have lost something along the way.
In 2014, a study found that many college freshmen had insufficient computer skills - an astounding discovery, considering many of them would have been digital natives. It was further discovered that after completing a computer literacy course, students exhibited significant improvement in their understanding of the course material. I find this fascinating. What was so critically missing from their presumably long years on the internet and their K-12 digital literacy training that such a rudimentary addition was required? Other studies point the same way: in a 2014 Australian study, researchers found that of the 2,049 college freshmen studied, 45% could be defined as having only “basic” digital skills.
An emerging idea is that the concept of “digital natives” is in fact a fallacy - one that is creating a serious blind spot between what we think is appropriate workplace and college preparation and what is actually required.
Join me next time as we explore how some schools are dialing back on digital literacy, and focusing on computer literacy instead.