20 Guiding Questions To Develop A Digital Literacy Plan

by TeachThought Staff

Literacy is a chief concern for both academic and professional progress.

Digital literacy is emerging as a genuine concern in education as technology competes with traditional texts for student attention. Academic standards have seen recent revisions, but these remain insufficient to address the rapidly changing literacy needs of students.

So we’ve put together some questions to help you design a plan of your own–one based on effective and accessible data and measurement of student performance. While data can mislead and obscure–often tragically–literacy skills are one area where the numbers are hard to argue with.

The consistent assessment and promotion of a student’s ability to consume and produce a variety of digital and non-digital texts is at the foundation of any school’s mission.

20 Questions As A Guide

The questions below are intended to act as a guide for someone setting out to create a literacy plan for a classroom, school, department, or district. It isn’t a framework or a model, nor does it offer answers–only questions to help you design a plan of your own. By designing it yourself, the benefit should be increased authenticity, staff capacity, and potential buy-in for the plan.

It is included as an image at the bottom of the post.

1. What are the specific goals for our literacy plan?

Let’s start not with research and theory, but goals and data. One approach that has merit is to draft some goals without doing any research–just write them down without analyzing or worrying about what we’re missing.

Next, let’s look at some macro data–reading and writing score trends, reading data trends, etc., then more micro data–a grade level, a teacher, a demographic subset of students, the results of specific strategies, and so on. Then we can zoom back out–what takeaways are there? What does the data suggest? What does it insist? And more importantly, what data are we missing?

Goals needn’t have every single element of the SMART (Specific, Measurable, Assignable, Realistic, & Time-Bound) framework, but the more specific we can be, the better chance we have to meet those goals, and monitor progress along the way.

Then, how can we set goals for year 1, 2, 3, 4, and 5, and have a process in place to review progress and potentially revise goals annually (or even more frequently)?

2. How will we know we’re making adequate progress towards those goals?

And further, who should know about that progress (or lack thereof)?

3. How are digital and text-based literacy similar and dissimilar?

What does the research say? What do thought leaders say? What do our teachers and students think?

4. How will we assess digital literacy?

Digital literacy is unique. Without traditional means of assessment, how do we respond? Is it important to assess digital literacy straight away?

5. How can we use data for each without drowning in it?

A plan hoping to produce data should use it as well–in fact, it should be born in a data-rich environment.

That said, absolutely no teaching or learning happens with data–data only predicts, reflects, and informs teaching and learning. Being data-aware and even data-driven are fine, but obsessing over data makes a mess of things, creating a robotic, numbers-based plan that makes sense to no one but accountants.

So, what’s our plan?

6. How exactly do we define struggling reader and struggling writer?

How many data points does it make sense to use, and with whom should those definitions–and the subsequent labeling of students–be shared?

7. In a perfect world, what data would we have?

This can help inform future technology purchases, assessment design, and related school policies and procedures.

8. In our own school and district, what data do we actually have?

This asks that we confront the reality of where we are. What data do we have to identify struggling and gifted readers and writers, to monitor progress towards mastery, and plan instruction accordingly?

How/where is that data inadequate, and what other methods do we need to collect other critical data?

9. What tone should our plan have, and how do we create that tone?

How can we develop a plan with a general tone that builds on students’ strengths–that promotes strong literacy rather than correcting deficient literacy?

10. When should technology automate, and when should it personalize?

And what technology do we have that can help in each area?

11. What assessment pattern can we adopt to ensure a “best practice” mix of both academic and authentic writing pieces that yield quality data?

This includes both quality- and quantity-based components: the formats and genres we plan to assess by grade level; any relevant criteria–number of sources, citation styles, etc.; and how frequently we plan to administer those assessments.

12. How can we seamlessly integrate assessment results into revision of planned curriculum and instruction?

This may be crucial to keep a “Literacy Plan” from devolving into a simple “Assessment Plan.” This will also enable differentiation of instruction at the classroom level. Traditional lessons and units don’t always yield the data necessary to make quick adjustments, nor are traditional teacher planning tools like units responsive in absorbing data along the way.

With that in mind, what should change?

13. What role can students play?

Beyond performing well on assessments? What substantive and meaningful role?

14. What is the relationship between digital reading and non-digital writing–and vice-versa?

How can one feed the other? What data do we have for each, and what happens when we attempt to reconcile them? If, for example, On-Demand writing scores average 35% lower than Reading scores on state assessment results, what does that tell us?

15. How can we support struggling writers early (as we do readers)?

Let’s consider everything from blogging and creative expression apps like Storehouse, to standards-based reporting, exemplars of work, precise rubrics, consistently high standards, and clear communication about writing/composition ability.

16. How can we use technology to support teachers?

How can we use technology to share graphic organizers, writing prompts, rubrics, writing strategies, exemplars of student work, and other artifacts of best practice?

17. How can we align our writing plan with other existing school programs?

And how can we do so in a way that minimizes inertia and extra work while maximizing efficiency and student achievement? This would include data team meetings, data reporting, grading and homework policies, staff meetings, 1:1 programs, tablets, response to intervention (RTI) classes, schedules, and other school-level components.

18. How can we better report student mastery to all stakeholders so that all stakeholders can contribute?

Here let’s consider all content area teachers, administrators, parents, and other schools for the purpose of vertical alignment–especially early and consistently enough for that reporting to potentially make a difference to someone other than the classroom teacher.

19. What kind of data framework supports vertical alignment?

That is, what universal data form can we use district-wide–or at least school-wide–to pass along data packaged in a way that is immediately recognizable and useful without Herculean effort on our part?

20. How can we design the plan so that it cannot fail? 

How can we design the plan so that, if nothing else, we’ll see gains in two or three critical areas, even if the rest of the plan falls on its face because it’s too ambitious?
