Connecting Classroom and Large Scale Assessment:
The CLR Moderation Process

Findings from the 1996 CLR moderations conducted by the CLR Core Development Group and the Center for Language in Learning for CLR-registered schools

Fieldwork: April-June, 1996

This is a publication of the Center for Language in Learning, a non-profit organization with 501(c)(3) tax exempt status. The mission of the Center is to offer school districts and the concerned public a fair and accurate evaluation system which integrates classroom assessment with teaching and learning. Questions about this report should be addressed to Dr. Mary A. Barr, Center Director, 10610 Quail Canyon Road, El Cajon, CA 92021.


A total of 155 student records from eight schools were moderated at the K-3 grade span, where the emphasis of assessment is on tracing children's progress toward becoming independent readers. The sample is small and not representative of the total state population: 34% to 86% of the students at each school had been identified for Title 1 assistance, and 31% reported a home language other than English.

Data compiled from this year's moderations and depicted in the graph below, however, illustrate the kind of useful information the CLR can provide school staffs and parents about how well their school's literacy program is helping students learn to read. If these data had been drawn from moderation results at one school, for instance, staff and parents would learn that 32 (70%) of their 46 third graders demonstrated that they are readers. Eight (18%) showed evidence of being exceptionally fluent or advanced readers, 11 (24%) are fluent or proficient readers, and 13 (28%) are moderately fluent or partially proficient readers. These percentages contrast sharply with those of the second graders, approximately half of whom have not yet reached the moderately fluent level. Each of the five developmental levels on the K-3 scale is defined beneath the graph.

A chart demonstrating reading achievement, K-3

1 Beginning reader: Uses just a few successful strategies for tackling print independently. Relies on having another person to read the text aloud. May still be unaware that text carries meaning.

2 Not yet fluent reader: Tackling known and predictable texts with growing confidence but still needing support with new and unfamiliar ones. Growing ability to predict meanings and developing strategies to check predictions against other cues such as the illustrations and the print itself.

3 Moderately fluent reader (partially proficient): Well launched on reading but still needs to return to a familiar range of texts. At the same time, beginning to explore new kinds of texts independently. Beginning to read silently.

4 Fluent reader (proficient): A capable reader who now approaches familiar texts with confidence but still needs support with unfamiliar materials. Beginning to draw inferences from books and stories. Reads independently. Chooses to read silently.

5 Exceptionally fluent reader (advanced): An avid and independent reader who is making choices from a wider range of material. Able to appreciate nuances and subtlety in text.


The emphasis of assessment at this grade span is on tracing the growth of students' reading experience. With the current focus in California schools on Grades K-3, this year's sample is small at Grades 4-6 (59 records) and does not include a sampling from Grades 7-8. While the number of records is too small for a representative sampling of student performance, the chart below uses the compiled data to illustrate the movement from inexperience in reading toward the goal that all students become experienced readers of grade-level and course-specific texts. If the information had been drawn from a single school, the faculty would know that this year's fifth graders are demonstrating more reading experience, at least with fifth grade texts, than the sixth graders, so instructional activities and strategies may need to be examined and perhaps adjusted.

A chart demonstrating reading achievement, 4-8

1 Inexperienced reader: Experience as a reader has been limited. Generally chooses to read a very easy and familiar text where illustrations play an important part. Has difficulty with any unfamiliar materials and yet may be able to read own dictated texts confidently. Needs a great deal of support with the reading demands of the classroom. Overdependent on one strategy when reading aloud; often reads word by word. Rarely chooses to read for pleasure.

2 Less experienced reader: Developing fluency as a reader and reading certain kinds of material with confidence. Usually chooses short books with simple narrative shapes and with illustrations. May read these silently; often re-reads favorite books. Reading for pleasure often includes comics and magazines. Needs help with the reading demands of the classroom and especially with using reference and information books.

3 Moderately experienced reader (partially proficient): A confident reader who feels at home with books. Generally reads silently and is developing stamina as a reader. Is able to read for longer periods and cope with more demanding texts, including novels. Willing to reflect on reading and often uses reading in own learning. Selects books independently and can use information books and materials for straightforward reference purposes, but still needs help with unfamiliar material, particularly non-narrative prose.

4 Experienced reader (proficient): A self-motivated, confident and experienced reader who may be pursuing particular interests through reading. Capable of tackling some demanding texts and can cope well with the reading of the curriculum. Reads thoughtfully and appreciates shades of meaning. Capable of locating and drawing on a variety of sources in order to research a topic independently.

5 Exceptionally experienced reader (advanced): An enthusiastic and reflective reader who has strong established tastes in fiction and non-fiction. Enjoys pursuing own reading interests independently. Can handle a wide range and variety of texts, including some adult material. Recognizes that different kinds of text require different styles of reading. Able to evaluate evidence drawn from a variety of information sources. Is developing critical awareness as a reader.


The CLR reading performance scale for Grades 9-12 was tested in classrooms for the second year in 1996. The scale focuses assessment on tracing student progress toward deepening and sharpening their reading abilities, that is, toward becoming accomplished readers. As in last year's pilot, the teachers reported that the scale does indeed reflect the range of desired high school levels of performance. Teachers at a CLR-registered high school this year used the scale to moderate nine records at Grades 9, 11 and 12. Five students (two from Grade 9, two from Grade 11, and one from Grade 12) demonstrated achievement at the moderately accomplished level. Two, one each from Grades 9 and 11, demonstrated achievement at the accomplished level, and two twelfth graders showed evidence that they were at the top of the scale, i.e., exceptionally accomplished. Scale level definitions are as follows:

1 Literal reader: May be able to derive meaning from a variety of texts, yet usually takes them at face value. Is unaware that he/she may rightly challenge the writer's claims, evidence, or ideas, or may legitimately critique a text for style, logic, organization, etc. Expects or settles on one single view of the text. Sees most text as unrelated to life outside of school. May express frustration with density of course texts. Frequently abandons the reading of books, even those he or she has ostensibly chosen. May rely on non-print media to collect information. Relies on others for interpretation of text meanings. Shows a lack of familiarity with common text organizers, e.g., headings, index. Defines him or herself as one who dislikes reading.

2 Less accomplished reader: May see self as a poor reader but can read course text with preparation and support of visual and/or auditory supplement, e.g., graphics, oral readings. Will read assigned texts but does not read for pleasure or for purposes outside of school. May still rely on getting course information from media other than text, including collaborative groups and film or tapes. Is beginning to collaborate with others to construct meaning in text. Tentative use of advance organizers and genre schemas. Can apply prior experience to some aspects of stories and biographies but may be unable to relate his or her own past experience to abstract ideas presented without context or "hands-on" application.

3 Moderately accomplished reader (partially proficient): Often tentative about sharing own interpretation of texts. With preparation and support, can read aloud expressively. Has developed a sense of genre. Shows a willingness to persist with some difficult texts. Is beginning to make associations between texts and personal experience. Can explain the way particular texts are organized to help the reader derive meaning. Is aware, in interpreting texts, of the influence of the context (i.e. period of time, gender/status of author) in which they were written. Developing skill in using text ideas and challenging text assertions. Reads assigned texts.

4 Accomplished reader (proficient): An effective reader of particular genres, can provide convincing evidence of comprehension. Has strategies for unlocking difficult text, including the sharing of initial interpretations with other readers and the author's use of print conventions (punctuation, headings, index). Able to evaluate information from multiple sources, including text and personal experience. Able to explain the bases of contradictory interpretations and previously held misconceptions. Brings knowledge gained from text read outside of class to bear on course work. Selects books for a wide range of purposes. Can manage the reading of long texts outside of class.

5 Exceptionally accomplished reader (advanced): An avid reader who reads easily across the range of purposes for reading: from seeking information to exploring experience. Elaborates on connections he or she is making with text, able to explain how they aid understanding. Can weigh and compare relative strength and weakness, style, structure, credibility, or aesthetics among texts. Able to demonstrate texts' potential for multiple interpretations. Can explain the significance of the social, cultural or political history of a text. Reads aloud fluently, with appropriate expression.


This year's moderations again produced high rates of agreement among teachers in their judgments of student performance. With scale placement as well as student and teacher identification masked on each record prior to the moderations at both school site and regional sessions, 222 out of 227 records (98%) received two matched scores within two readings. Five records were not scored because regional readers determined that there was not enough evidence to assign scale placement. This compares with last year's rate of 88% agreement on a total of 198 records, 24 of which were unscorable. Of the number scored in 1996, the ratings made by the originating classroom teachers were corroborated by other teachers' placements 75% of the time. The records scored by other teachers at the site were consistent with the scores given by teachers from outside the school 83% of the time. The higher level of consistency produced by site and regional scorers may indicate that originating teachers needed more time to discuss placement decisions with their site peers.

This finding and other trends noted in this year's data will be studied for their impact on procedures for next year's moderations. Any discrepancies between teacher placement and site/regional placements seem, preliminarily, not to be due to inexperience with this kind of assessment, as was thought last year. It is clear that more originating teachers at more schools presented more records for moderation, in some cases the full 20% sampling necessary for school level program review and for validating teacher judgments in individual student reports. With some schools in their second year of CLR use, expansion of the system to all students is part of the site restructuring plan. More assistance for teachers is being planned by site staffs at second and third year schools, e.g., easier computer access and schoolwide scheduling of pupil-free days for summarizing student data collections. Of special interest this year was the preliminary finding that second year CLR students who had become readers by third grade leaped over the bottom rung on Scale 2 at fourth grade. This scale moves beyond assessing how well students read independently to focus on the breadth of their reading experiences. Such a trend may indicate that students are not only maintaining but are continuing to expand their reading abilities.


Critics of conventional assessments charge that their use narrows the curriculum to what can be easily measured and, further, that they cannot provide useful, timely information to use in improving teaching and learning. To study the effects the CLR assessment system, including participation in the moderations, has had on classroom practice, teachers have been asked each year, 1988-1995, to describe the changes they will make in their use of the CLR the following year. This year, an analysis of their responses to this question is again underway as part of a study of the CLR assessment system's reliability and validity. The results of this study, which includes a comprehensive survey of specific teaching practices, will be available in January or February of 1997. Meanwhile, the guide to responsible student assessment recently published by the National Forum on Assessment, Principles and Indicators for Student Assessment Systems, provides a major resource for the standards by which the CLR system can be judged. The guide is available for $10 from FairTest, 342 Broadway, Cambridge, MA 02139.

Single reflections, chosen at random, give some idea of how the participants felt about the moderations. One parent, present at the site moderation, said she found the process very interesting, noting that "it's great to see the enthusiasm and dedication the teachers have put forth in getting this program off the ground." She liked the idea of the paired readings: "Each one was able to notice a little something additional or different in the various records, which helped me understand the process more fully." Furthermore, she felt that the information "really helps a parent see their child's strengths and where they can encourage them to grow." A principal complimented those participating as "pioneers" and urged them to continue their "good work." A teacher and parent responded to the experience by saying, "I can't believe I am saying this. . . 'it was really fun!' I felt the record showed an accurate picture of the child's reading and writing. The record was so descriptive that it created a clear picture of the child. . . ."

Overview of the 1996 CLR Moderation Process

In May and June of 1996, teachers met for 4-5 hours at three sites--Laytonville, Stockton, and Pomona--to read and score student records during the third annual CLR regional moderations. The moderation process began at their schools, where, with classroom teachers' scores masked, they had shared their own students' records and moderated--that is, discussed and sometimes modified--their interpretations of the performance scales for measuring reading achievement in the light of these concrete examples. At the regional sessions, teachers representing each site used the same scales to read and score again the records from each other's schools. School and regional placements had to agree or the records went to a third reader, a certified CLR teacher-leader. Records judged as having "not enough evidence" were not scored.

The purposes of the annual moderations are

- to monitor increases and declines in student achievement at CLR-registered schools,

- to validate the judgments of student progress made by classroom teachers using the CLR,

- to determine the support needed by teachers as they use the CLR assessment system.

What's in a California Learning Record?

The CLR system does not require teachers and students to follow a prescribed curriculum or menu of products. There is, however, a standardized format which includes:

- parent descriptions of literacy beyond the school

- student work samples and self-reflections about academic progress

- teacher observations/analyses of progress

- teacher summaries of student achievement with recommendations about what is next to be learned.

These data are collected in the CLR, along with copies of work from student portfolios, selected and attached for moderation purposes.


The California Learning Record's Core Development Group members and others who led the moderations this year are as follows: Leader Dolores Fisette with panel members Stephanie Watson and Gloria Jarrell, northern California; Leader Janet Ghio with panel members Judy Lynch and Muriel Olsen, Central Valley; and Leader Sally Thomas with panel members Alice Kakuda and Elaine Ruffner, southern California.

Other Core Development members conducted site moderations: Dana Arntson, El Cajon; Dolores Fisette, Ukiah and Laytonville; Judy Lynch and Bev Ruby, K-6, Woodland; Muriel Olsen, Dos Palos; Kathy Thompson, K-6, Fresno; Sally Thomas, Los Angeles and Pomona; Stephanie Watson, Eureka.

Much appreciation is due the guests who contributed detailed observations to a study of the moderation process: Aleen Arbaugh, Zane Junior High School, Eureka; Walter Masuda, UC Berkeley doctoral student; Debra Schneider, UC Berkeley doctoral student.

Our thanks to hosts at the following schools/district, which served as sites for the 1996 CLR moderations: Bev Clark, Title 1 administrator, Lincoln Unified District Office in Stockton; Amauri Rodriguez, principal, Lincoln School in Pomona; and Jeanne Casella, principal, Laytonville Elementary & Middle School in Laytonville.

Phyllis Hallam, UC Berkeley doctoral student, prepared the surveys and analyzed the moderation data collected on computer programs designed by Saul