September 8, 2016

Starting the conversation about connected reading

This semester in my undergraduate content-area literacy courses, I am piloting ideas from Turner and Hicks’ (2015) Connected Reading: Teaching Adolescent Readers in a Digital World.

Connected reading “is a model that situates individual readers within a broader reading community and acknowledges a variety of textual forms, both digital and print” (p. 5). Connected reading is NOT the same as online reading because the emphasis is on digital practices, not digital text.

Turner and Hicks explained, “In short, we see that readers are connected to one another in increasingly useful ways and that they make meaning of what they read in various ways through their connections” (pp. 5-6). 

Turner and Hicks' book is based on research conducted in 2013 with 12 middle and high school teachers and several hundred students from across the country. At the end of their book, they summarize several implications for teaching and learning, beginning, of course, with teacher learning: “A shift in thinking about the nature of reading requires that educators talk with one another about the practices of real-world readers, not just the limited skills that are tested or the standards that can be linked to instruction” (p. 139).

I interpret this as further evidence that new literacies education must be integrated into teacher preparation programs.


I already survey my students’ attitudes and predispositions toward digital technology, and I require them to sample several new digital tasks, texts, and tools each semester. What’s different this semester is that I am trying to take a more deliberate and stepwise approach by implementing some of the same lessons and tasks that Turner and Hicks used with their participants, beginning with the “Who am I as a (digital) reader?” lesson (pp. 101-102).

The lesson follows the administration of a “Digital Reading Survey.” I took several of the questions from the Turner and Hicks survey and combined them with my pre-existing Interest Inventory and Literacy Diagnostic created with Google Forms. 


I like Google Forms because A) it aggregates the data into easy-to-read representations, B) it allows me to send a link to the teacher-learners so they can complete the inventory on their own time outside of class, and C) it’s free.

Once the data is assembled in one place, the teacher-learners are prompted to study the data and make some observations. The idea is to develop a critical and reflexive stance toward “texts, contexts, and attributes that influence reader choices” (p. 100).
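
For anyone who wants to dig into the responses beyond the summary charts Forms generates, the responses can also be downloaded as a spreadsheet/CSV and examined directly. Here is a minimal sketch in Python (pandas), assuming the export is saved as responses.csv; the column names are hypothetical stand-ins for the actual survey items.

```python
# Minimal sketch: summarizing a Google Forms export with pandas.
# Assumes the responses were downloaded as responses.csv; the column
# names below are hypothetical placeholders for the actual survey items.
import pandas as pd

df = pd.read_csv("responses.csv")

# How many teacher-learners chose each reading format?
print(df["Preferred reading format"].value_counts())

# How often do respondents report going online?
print(df["How often do you go online?"].value_counts())
```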

We did this on Day 1 of the Fall 2016 semester.

I sent out the link to the form about five days in advance of our first class meeting. Then, on the first day of class, I performed an adaptation of the lesson described on pp. 101-102 of Connected Reading. I taught the same lesson to teacher-learners across two different sections of my content-area literacy course.

Two interesting observations emerged from this exercise.

First, several teacher-learners noticed that a majority of their peers preferred reading on paper over reading on a screen, while a similarly large proportion reported going online “several times a day.” If they don’t like screen-reading, what are they doing online several times a day? Presumably some of that time online involves reading, which underscores an idea I have long advocated: personal preference is not the only “attribute” that mediates our reading choices. We live in a hyper-connected 21st-century context that demands a fair amount of flexibility on our part to move with ease (if not always with pleasure) between mediums.

Next, we noticed that approximately the same number of respondents (an identical percentage, in fact, in one section) said A) they read less because of electronic devices and B) these tools should not be used in reading instruction. This prompted one student to wonder whether the students who said they read less because of devices are the same ones who don’t believe digital tools can be used to improve reading skills. Similarly, of those students who said they read more because of digital/mobile devices, how many agree that such devices should be used to teach reading?

Lots of ideas for item analysis and data disaggregation!
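
As one sketch of what that disaggregation might look like, the survey export could be cross-tabulated item by item, for example, pairing the “effect of devices on how much I read” item with the “should devices be used to teach reading” item. Again, the column names below are hypothetical stand-ins for the actual survey wording.

```python
# Minimal sketch: item-by-item disaggregation of the survey export.
# Column names are hypothetical stand-ins for the actual survey items.
import pandas as pd

df = pd.read_csv("responses.csv")

# For each answer about how devices affect reading volume, show the share
# of respondents who think devices belong in reading instruction.
# normalize="index" converts counts to row proportions.
table = pd.crosstab(
    df["Effect of devices on how much I read"],
    df["Should devices be used to teach reading?"],
    normalize="index",
)
print(table.round(2))
```

A quick table like this would speak to the student’s question directly: for each group of readers, it shows what fraction endorses using devices in reading instruction.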

There are some things I will do differently next time to improve this lesson: 

  • I like the idea of prompting a reflection from students about “the ways they read each day” before they actually complete the form. After the form, the students are prompted to reflect again on “what I realized about myself as a reader as I answered the survey questions” (p. 101). The pre- and post-reflections can then be compared. However, I am not sure how to elicit the pre-reflection when I am already asking students to complete the form before the first day of class.
  • I need to be more intentional about presenting the data set as “text” and modeling, by way of a teacher-directed think-aloud, my process of meaning-making through close reading of text. This is my first opportunity to reinforce key concepts presented in the NCTE policy brief Reading Instruction for All Students, which is mandatory first-day reading. The brief provides a nice overview of the highly contested terms “text complexity” and “close reading,” which were brought to the foreground of reading education by the Common Core State Standards. My students, many of whom were in high school when the standards were first being adopted and implemented, often come into the teacher education program with vague and misguided understandings of these terms.
  • I can further scaffold the teacher-learners by providing prompts and cues similar to the “Looking at Data” protocols used in school reform initiatives.
