Tuesday, June 16, 2015

Literacy Update: Finishing the Year

Eight days left of school, 23 students left to assess. We should be able to get most of them done. As we think about next year, we may want to brainstorm ways to be more efficient with our time. Assessments can take a very long time if:

  • A student reads very slowly and/or takes a lot of time to answer questions.
  • The texts (especially at the higher levels) are longer and more complex.
For students with disabilities reading a Y text, a single assessment can take an entire 40-minute period.

That said, here is who we have left:
  • 5 sixth graders
  • 5 seventh graders
  • 8 eighth graders (two of whom are in process)
  • 5 students in 963
Some notes about how growth is currently displayed. This is another area we should discuss for next year:
  • The "Spring Chart" tab shows where students currently fall. The number in parentheses next to a student's name indicates how many TC levels higher this is than in the fall. A tilde means the student is at the same level. The word "new" means that the student's first assessment took place in the spring.
  • The "Growth Chart" tab shows students' TC grade-level equivalents and the grade-level growth they have made since the fall. Note that a student could move up three TC levels but only one grade level, depending on where along the TC spectrum they are. (For example, moving from Q to T is three TC levels but only one grade level, 4th to 5th.)
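For anyone curious how the chart annotations could be generated automatically, here is a minimal sketch. It assumes a simple A=1 through Z=26 numbering for TC levels (the doc's "number equivalents"); the function name and exact mapping are illustrative, not how the spreadsheet actually works.

```python
# Sketch of the Spring Chart growth annotations: "(n)", "~", or "new".
# Assumption: TC letters map to consecutive numbers A=1 ... Z=26.
import string

TC_NUMBER = {letter: i + 1 for i, letter in enumerate(string.ascii_uppercase)}

def growth_annotation(fall_level, spring_level):
    """Return the annotation shown next to a student's name."""
    if fall_level is None:  # first assessment took place in the spring
        return "new"
    diff = TC_NUMBER[spring_level] - TC_NUMBER[fall_level]
    if diff == 0:
        return "~"          # same level as in the fall
    return f"({diff})"      # number of TC levels gained
```

For instance, a student who moved from Q in the fall to T in the spring would show "(3)" even though that is only one grade level of growth.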
Some notes about how we might interpret this data:
  • Clearly, "growth" in TC levels appears more impressive than when converted to grade level, but it can overstate how much progress a student has actually made.
  • How should we feel about students making one grade level of growth? Moving up one grade level from the beginning of the year to the end means the student is keeping pace. Is this the goal? Or should we be striving for more than one grade level of growth?
  • Should the Spring Chart have the colors shifted to reflect the fact that these will be the incoming levels of these students in their new grade?
  • Students who score a Z+ cannot show growth. This is problematic for our data system as some of our top learners will not be in "the pool of growth data" from these assessments.
As you examine these tabs and as we think about next year, I'm curious what else you notice and what revisions you might suggest.

Enjoy!

Monday, April 13, 2015

On Leveled Libraries

Thought I would update you on some revisions to our classroom library in the seventh grade. We are moving away from having every "letter level" labeled in the library toward more general ranges. These ranges loosely correlate to the F&P grade-level bands.

We are using colors as well, instead of letters or numbers. They are as follows:

  • Blue = QRS (and in some cases lower than Q; roughly 4th grade level)
  • Green = TUV (roughly 5th grade level)
  • Maroon = WX (roughly 6th grade level)
  • Yellow = Y (as well as some previously-identified Zs leveled that way mostly because of "mature content" rather than text complexity; roughly 7th grade level)
  • Zs will stay labeled as Zs and will still be separated out.


Most books now have a colored sticker on the front to indicate which band they belong to.
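The band scheme above is essentially a lookup table, and a quick sketch makes the boundaries explicit. This is illustrative only: the band letters follow the list above, sub-Q books are grouped with Blue, and the function name is made up for the example.

```python
# Illustrative lookup for the colored-sticker bands described above.
# Assumption: everything below Q is grouped into the Blue band,
# and Z titles stay labeled Z and shelved separately.
BANDS = [
    ("Blue",   "ABCDEFGHIJKLMNOPQRS"),  # QRS and lower; ~4th grade
    ("Green",  "TUV"),                  # ~5th grade
    ("Maroon", "WX"),                   # ~6th grade
    ("Yellow", "Y"),                    # ~7th grade
]

def sticker_color(fp_level):
    """Return the sticker color for an F&P letter level."""
    if fp_level == "Z":
        return "Z"  # Zs keep their label and their own section
    for color, letters in BANDS:
        if fp_level in letters:
            return color
    raise ValueError(f"Unknown F&P level: {fp_level}")
```

A book leveled U, for example, would get a Green sticker, while anything at or below S lands in Blue.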

There are several benefits to this method:

1) It removes, albeit slightly, some of the stigma associated with choosing a book from a discrete leveled bin (and the pretense that books can be scientifically assigned to an exactly accurate 'level').

2) It simplifies the task of deciding how to level books that have not been officially leveled by F&P, allowing us to assign a more general band.

3) It allows students to quickly determine a general level when choosing from a genre-based bin by looking at the colored stickers.

This from Lucy Calkins, the purveyor and proponent of all things leveled:

We've leveled many, but purposely not all, of the books in every classroom library. The fact that we have leveled these books does not mean that teachers should necessarily convey all of these levels to children. We expect teachers will often make these levels visible on less than half their books (through the use of colored tabs), giving readers the responsibility of choosing appropriate books for themselves by judging unmarked books against the template of leveled books...
We do not imagine a classroom library that is divided into levels as discrete as the levels established by Reading Recovery or by Gay Su Pinnell and Irene Fountas... These levels were designed for either one-to-one tutorials or intensive, small-group guided reading sessions, and in both of these situations a vigilant teacher is present to constantly shepherd children along toward more challenging books.
If a classroom library is divided into micro-levels and each child's independent reading life is slotted into a micro-level, some children might languish at a particular level, and many youngsters might not receive the opportunities to read across a healthy range of somewhat-easier and somewhat-harder books. Most worrisome of all, because we imagine children working often with reading partners who "like to read the same kinds of books as you do," classroom libraries that contain ten micro-levels (instead of say, five more general levels) could inadvertently convey the message that many children as well as many books were off-limits as partners to particular readers...
...Of course leveling books is and always will be a subjective and flawed process; and therefore teachers everywhere should deviate from assigned levels, ours and others, when confident of their rationale, or when particularly knowledgeable about a reader.
Some ideas are worth highlighting here and are important when explaining the rationale for our classroom libraries and independent reading program:

  • The notion that every child should be aware of his or her exact level and only read books at that level (which is the system in many schools) is not endorsed by the very founders of this style of reading workshop.
  • Teacher discretion is valuable and important.
  • This does not mean we should not assess students' levels or have leveled ranges in our classroom.
I also want to point out that I think leveling and instruction associated with leveling is different for middle school than elementary school. It is especially different when dealing with students who have significant reading disabilities. Calkins seems most concerned about students "languishing" with lower level books. It seems at our school we are more concerned with students continually choosing books that are not "somewhat-harder" but much too hard. In other words, we sort of have the opposite problem. I think once students hit seventh grade it is much more challenging to encourage those students still reading at a third or fourth grade level to read Amber Brown or Encyclopedia Brown while classmates next to them read Divergent or a steamy Sarah Dessen novel.

I am curious to see how much growth we see in the TC assessments this spring, and the implications of what we find for revising our program next year.


Wednesday, March 18, 2015

Spring TC Assessment Calendar

Below is the intended calendar for spring assessments, which you can use to anticipate and check on updated leveling information. I'm counting on these assessments taking less time than the fall assessments, largely because we will not have word-list assessments and I know what level to begin at for each student. The schedule for Ms. Collazo's breakout ELL students is forthcoming.

MARCH (17)

6TH GRADE

  1. Aaliyah
  2. Ian
  3. Albert
  4. Elijah
  5. Joseph
  6. Roberto
7TH GRADE
  1. Kassandra
  2. Adryanna
  3. Reginald
  4. Dimitri
8TH GRADE
  1. Steven
  2. Makiyah
  3. YuQiang
  4. Kathleen
  5. Leila
  6. Meilani
  7. Michael


APRIL (18)

6TH GRADE
  1. Marlon
  2. Nadav
  3. Pearlasha
  4. Susan
  5. Daniel
  6. Tadberly
7TH GRADE
  1. Maya
  2. Samantha
  3. Tamayu
  4. Faith
  5. Michael
  6. Joshua
8TH GRADE
  1. Eddie
  2. Diamond
  3. Juan
  4. Gina
  5. Brandon
  6. Demi


MAY (15)

6TH GRADE
  1. Raina
  2. Kevin
  3. Layla
  4. Benjamin
  5. Cristian
7TH GRADE
  1. David
  2. Jzeni
  3. Tywan
  4. Elian
  5. Ethan
  6. Alyssa
  7. Mariengis
8TH GRADE
  1. Cristian
  2. Jayden
  3. Jorge


JUNE (25)

6TH GRADE
  1. Alexas
  2. Kayla
  3. Chelsea
  4. Nissi
  5. D'Niko
  6. Jaydali
  7. Starasia
  8. Naelon
  9. Shamar
  10. Taina
7TH GRADE
  1. Carlique
  2. Henky
  3. Joshia
  4. Victoria
  5. Nhyjel
  6. Madison
8TH GRADE
  1. Enam
  2. Kiara
  3. Alexus
  4. Brianna
  5. Elliot
  6. Lyza
  7. Nicholas
  8. Elijah
  9. Kayla


NO ASSESSMENT (TESTED OUT)
  1. Ilan
  2. Xin Yi

March Literacy Update

Here are some thoughts and updates to our literacy program as we enter the spring season:

1. Notice some changes to the literacy document.
  • We now have a tab called "Cal/Text" which shows the dates of exactly when each student was assessed, at what level, and whether they passed. This allows me to quickly know which level (and text set) to begin with as well as to make sure that spring assessments happen roughly six months after fall assessments for each student.
  • We also have a new "growth" tab, which I will update at the completion of each assessment. We will track growth both in level as well as deviation from the target grade level. Note that I am using the number equivalents of the TC letters to enable numerical calculations.
  • Also note the "SMOG" tab has been moved to the UNMS Leveled Text Database so these titles can be integrated into our master list. The "All Texts" tab may be updated at any time by any teacher who would like to create unofficial levels for the many books in our library that are not leveled by F&P. While these are estimates, they will help us get more books into the library that match our readers. Some guidelines:
    • You may have students help but please review their work and entries. I have trained two students in the method but I still review their decisions before entering them into the database.
    • Only use the official F&P website when assigning an F&P level. Sites such as Scholastic and Booksource do not always have the official levels and/or are inaccurate. (I am in the process of reviewing what is in our list and have found several mistakes already.)
    • If assigning an "unofficial", "smog", or "RC" level, please briefly explain your rationale and any other pertinent information in the notes section.
    • All other tabs in the doc should not be touched. They are pivot tables that should update automatically. Please send me any feedback for making this document better.
2. Spring TC assessments will start next week. I still need to finalize the calendar and print out and sort all of the TC "Set 2" assessments. I am also going to update the main pages for each grade (the ones with the narratives) to allow for more dialogue about each student. If anyone has ideas on how this might look, let me know. I'm thinking simply more columns into which classroom teachers (including non-humanities teachers) can include updates, noticings, and interventions. I think it will be important that we identify the moves we're making, especially with our most struggling readers, and the impact they are having. As part of my Literacy PD time I can compile highlights of what we are all doing into a weekly digest and we can discuss further in Humanities Team.

Tuesday, February 3, 2015

Considering Writing Assessments

Assessing for writing using a standardized assessment certainly appears more complex, time-consuming, and variable than assessing for reading. However, it also makes sense to have something. During my Literacy time today I scoured the web for ideas. There isn't too much out there beyond "sample items" from various state tests or multiple-choice questions that somehow purport to measure writing ability.

The TC Writing Assessment is most definitely a commercial product that attempts to level student writing much the same way we level student reading. It appears to involve copious amounts of checklists along a continuum for all three writing types as identified by the Common Core. If incorporated into instruction, student checklists would complement writing instruction and correlate to the rubrics. (My favorite quote: "Who would have thought that checklists could be such a source of energy!")

That said, the assessments are intended to assess writing separately from reading and do provide a wealth of samples, rubrics, and prompts that we could use to assess writing for all of our students, perhaps even as a baseline administered in our classrooms.

I would suggest we at least get the resource and try it out this year. I can choose a group of students to pilot this with, perhaps, or we could try it across our classrooms.

Here is the link to the product, through which you can also view a sample chapter.


Wednesday, December 10, 2014

The Case of the High Reading Level and Low Test Score

Now that we have assessed over 80% of the school, some interesting information is coming to light: patterns, outliers, and divergences that require further inquiry. In the next few posts, I'll look at some of these cases and offer some thoughts. First, the case of the high reading level and low test score:

Generally, we see a correlation between reading level and test score. However, for several students across all grade levels, the two diverge. Some students demonstrate proficiency up to a level Z while scoring in the low-to-mid 2 range on the ELA test. What might account for this difference?

One theory is simply test anxiety. Some of these students reported to me that they get very anxious on test days. I should note that their anxiety was also present to some extent during the TC assessments, but these anxieties may be allayed by the low-stakes nature of these assessments as well as the human connection and personal reassurances that accompany them.

Another theory might reflect the fact that TC assessments are not all-encompassing and that background knowledge and interest (as we know) play a huge role in reading ability, especially when decoding is no longer (or less of) a factor. So, we see cases in which students might be proficiently and happily reading Z-level teen romance books but stumble with X-level historical fiction. We also know that the Common Core tests often seem to go out of their way to ensure that the texts are as uninteresting and non-relatable as possible.

Does this mean we weigh in more heavily on independent reading selections? Probably not, lest we commit readicide. We know from research that student choice is essential for independent reading to work and for students to build reading lives. However, a little nudge here or there, some reading goals, or "reading ladders" might be helpful.

Where I think we definitely can narrow this gap is in the content areas.

Reading in social studies, science, and art serves to bolster background knowledge, word and concept recognition, and familiarity with different text structures and genres. In ELA, too, we may introduce some "genre texts" such as historical fiction and memoir into the fold of the curriculum.

We must do more than just assign these texts, however. We need to cultivate interest and curiosity in them, knowledge of their structures and intent, understanding of their purpose.

What do you think? What other theories come to mind?

Wednesday, December 3, 2014

Introduction

Welcome to what I hope to be a fruitful and ongoing conversation about literacy at our school. For now, the main purpose of this blog will be to update the community on what I am learning about our students' literate lives as well as what I'm learning about patterns across the school and research and ideas for supporting and nurturing our students' growth.

Today, just the basics.

Where can I find information?
The quickest (and most general) information is on the "Rosters" tab of our 2014-2015 UNMS Program doc. You'll notice that each student has a number in the column "RGL", which stands for Reading Grade Level. This is an estimate of each child's reading level in number-format. (TC levels are A through Z and correlation to grade-level is not always exact.)

For more detailed information, check out the Literacy Data '14-'15 doc which contains different views of grade-level information as well as narrative descriptions of each student's reading as ascertained through the Teachers College (TC) Reading and Writing Project running records assessment.

What does this information mean?
Data culled from the TC running records provides just one way we learn about our readers. As you may notice, in some instances, divergences exist between a student's TC reading level and ELA test score, for example. In addition to this data, humanities teachers have routine reading conferences throughout the year with students to learn valuable information about their reading lives, strengths, and weaknesses.

In short, this information is a starting place: a quick snapshot for our community to get a sense of how, and how well, our students are reading. This is not an objective science, so we use all the information we can get to draw conclusions.

What can I do with this information?
Each of us can use this information to support our students with literacy. For starters, our work in professional development with Building Academic Language should have a direct impact on our students' reading. In each content area, we can use strategies to help students choose the right books, make sense of a math word problem, read a section in a history or science textbook, and build academic vocabulary, among other skills.

How I plan to use this blog
I would like to update everyone about once a week on individual students, patterns as a whole, and ideas I come across in various journals and books that are pertinent to this work. Please leave feedback, questions, or follow-up thoughts if you would like.

I will not auto-email the staff with posts. However, you may subscribe to posts either using a feed reader, such as Feedly, or by email.