Looking Collaboratively at Student Work: An Essential Toolkit
Some Guidelines for Learning from Student Work
The Collaborative Assessment Conference
The Tuning Protocol: A Process for Reflection on Teacher and Student Work
The Primary Language Record & The California Learning Record
The 'External Review' of Portfolios and Exhibitions
Making the Whole Student Visible: The Descriptive Review of a Child
Surfacing the "Opportunity to Demonstrate" Factor
Sampling a "Vertical Slice" of Student Work
What to Look for in Student Work: Some Standards for 'Authenticity'
Examining Student Work: A Constructivist Protocol
For More Information
Looking closely together at student work can unveil a treasure trove of insights to guide school communities as they reflect on their purpose, assess their progress, and plan strategies for reaching all children better. It's scary work, though, and respectful protocols can help.
The New York Times Science pages recently told the story of the heart surgeons in Maine, New Hampshire, and Vermont-there are only 23 in all-who agreed in 1993 to observe each other regularly in the operating room and share their know-how, insights, and approaches.
In the two years after their nine-month-long project, the death rate among their patients fell by an astonishing 25 percent. Merely by emphasizing teamwork and communication instead of functioning like solitary craftsmen, the study showed, all the doctors brought about major changes in their individual and institutional practices.
For teachers who, like heart surgeons, have traditionally worked as isolated professionals, the experiment holds a powerful lesson. If their goal is to lower the "death rate" of young minds and see them thrive, many educators now emphatically believe, they can do it better together than by working alone.
Like doctors making hospital rounds, architects gathered for a charrette, or lawyers examining clues to build a case, teachers in Essential schools have begun purposefully probing the rich evidence that lies immediately at hand in every school, searching for what it can yield about how students best learn.
They bring to the table their students' writing, math problem-solving, science projects, artwork, and whatever other evidence they can gather-in written notes or audio or video form-of what kids are producing every day.
Instead of disappearing into the bookbag or the wastebasket, these artifacts become a valuable mirror of how the school's practice does or does not reflect its intentions. Unlike a standardized test, their evidence speaks directly and revealingly of what teachers and students actually do and learn. Like a compass reading, it can then translate into informed action: changed perceptions of students; revised curricula and teaching strategies; new goals and a sense of direction for a faculty.
Rather than first focusing on the work's quality, these processes often ask teachers to suspend judgment and describe its qualities-bringing multiple perspectives to bear on what makes students tick and how a school can better reach them.
At first that multiplicity may complicate rather than simplify things, observes Joe McDonald, who has worked with the Coalition of Essential Schools on developing new methods to look at student work. But such discussion does not require consensus. Indeed, when people who come at school change with very different beliefs and assumptions meet to look at student work, their mutual understanding often deepens. Using diplomatic protocols that make communication feel "safe," they often find common ground and can move more surely toward creating the conditions in which teachers and students might do better throughout the system.
Across the country and abroad, school reformers have recognized the pressing need to place actual student work formally and respectfully at the center of both public and private conversations about school. From California to Vermont, from the Coalition of Essential Schools to its many working partners and friends in schools, teaching centers, and universities, people are trying out new tools for making change through that most radical of activities: unarmed discussion.
Though these tools differ, all share a focus on bringing together people across the school community -teachers, parents, students, and outside visitors-to look at work. All aim to learn something that will then affect future teaching and learning, not just the individual student whose work they examine. And all provide a formal structure, or "protocol," that, while often uncomfortable at first, surfaces and values different points of view.
In his work with the Coalition, David Allen has compared such protocols to putting on a play, "though the dialogue," he notes, "is mainly improvisational." Yet just as theatrical styles that range from classical to "method" can all work magic on the mind and soul, effective protocols have their styles and purposes too.
Some fall on the more evaluative end of the spectrum, aiming to analyze and thus improve teaching strategies and curriculum. Others rely more on close description to heighten teachers' understanding of individual children and hence affect their practice. Some look at a moment in time and extend its meaning outward; others take an accumulated body of evidence and draw new meanings from its larger picture.
"All these years we have been looking at students' work in order to see whether they have done what we told them to do," says Maine educator Marylyn Wentworth drily. Now a new set of purposes suggests itself, larger than what a red pencil can accomplish. And which process one chooses from the interesting array that has sprung up must grow from how well it suits one's purpose.
"Tuning" the Work Upward
In California, for instance, two groups of students from different houses in the same school used the CES-born "tuning protocol" (see sidebar below) to present project work to each other, critiquing the work in front of an audience of school administrators. In this structured, facilitated discussion model, participants give both "warm," supportive feedback and "cool," more critical feedback to the presenters, who then reflect on it together without interruption, "tuning" their craft much as a musician might tune an instrument to its peak effectiveness.
"The students asked each other hard questions about grading, about standards, about the objectives of the projects and the coaching students received," says Joel Kammer, a school coach who teaches at Piner High School in Santa Rosa and who has written about California's use of the protocol in David Allen's forthcoming book for Teachers College Press. "It helped build community and a sense of common purpose and shared responsibility. And it produced substantial and useful information about student work and the possibilities for improvement."
The tuning protocol is widely used within and between Essential schools-as a means of developing more effective exhibitions and assessments, as a way of developing common standards, and as a means to gather and reflect on ideas for revising classroom practice. Kammer describes, for example, how the staff of a restructuring school uses the same steps to get feedback from a panel of teachers from another school. And he tells how a "critical friends group" of faculty members meets regularly in Piner's library to scrutinize student and teacher work, ask questions, and suggest improvements.
In most instances, the protocol's ritual of presentation and response works to mitigate the defensiveness people typically feel when they present work for public critique. Participants take turns in timed segments and eschew direct response. Even the placement of chairs contributes to the purpose of this technique: neither to argue nor to reach agreement, but to gain the benefits of each other's diverse perspectives.
Nonetheless, Joe McDonald observes, bringing private work into the public eye constitutes a "culturally wrenching act" in most schools.
"Some people would never do this unless there was a clear structure to protect them," says Bill Munro-Leighton, who teaches at Brown School in Louisville. And Ceronne Berkeley of Boston's Center for Collaborative Education recalls that at first she reacted to the protocol with "very real fear that I would be criticized as a teacher."
The tuning protocol can prove especially useful in loaded situations where poor communication is a problem, school people say. Typically, its users pose an important question they hope to answer by a close look at actual work-whether the school needs a new policy on spelling, for example, or how to incorporate writing across the curriculum.
In California, the state Restructuring Initiative has taken the ritual one step further, using it to reflect on change at the system level. At an annual symposium, analysis teams from restructuring schools now go through its "California protocol" to discuss before a reflective audience the "critical questions" they have identified as a result of earlier sessions examining student work.
That state's wide-scale use of both the tuning protocol and its own "meta-protocol" reflects a decision to ground systemic decisions firmly in student work, says Juli Quinn, who works with many Essential schools in Los Angeles. "Otherwise the kinds of things the system does will not be related to what students and teachers need," she observes.
The tuning protocol has proved equally useful in giving shape to conversations among parents and community members about the content and quality of student work. In New York City, University Heights High School asks parents to use it at students' "roundtable exhibitions." And both University Heights and Central Park East Secondary School use it to obtain feedback from outside visitors on their graduation requirements and academic programs.
As these examples show, people tend to use the tuning protocol and its relatives as a concrete way to hold up their practice against some standard, or even to work out what their own standards look like. In that process, however, unexpected new meanings often arise.
"When we look as individual teachers at student work we often see it through the narrow lens of the assignment," says Daniel Baron of the Harmony School and Education Center in Bloomington, Indiana. "But a group looking at it together makes a new meaning, focused not on evaluation but on the much bigger question of what we can learn. Doing this builds a profound sense of community."
That sense of shared interest in the big picture of student learning also shows up in another method of looking at student work: the "Language Record" or "Learning Record" first developed for British primary schools and now used in California, New York, and elsewhere at many levels. This far-reaching description of a child's growing language skills across the curriculum draws the family, the student, and all the student's teachers into observing and discussing his or her learning over time.
Assessing the System
The Learning Record stands out among portfolio assessment techniques because it regards the student's entire life experience as relevant-honoring, for instance, a bilingual child's fluency in another language as a demonstration of literacy and communicativeness. But it holds promise at the systemic level, too, as a reliable and valid portfolio-based picture of a program's effectiveness. Teachers across a system use the same scale to rate students' growing fluency over time, resolving variations among their scoring through a sampling and "moderation" process.
Reflecting on the "authenticity" of a student's learning tasks is still another way to frame a collaborative look at student work in a way that has usefulness both to the teacher and to the system. Fred Newmann at the University of Wisconsin has devised a set of criteria that prompts teachers to think through their work with that quality in mind. His standards emphasize not only higher order thinking skills and application to real life but also the central content and processes of the academic disciplines. Though they can be used to "score" classroom instruction, assessment tasks, and student performances, they make an even more useful filter as schools use student work to prompt long-range plans for raising the level of teaching and learning.
At Harvard University, Dennie Palmer Wolf leads the Performance Assessment Collaboratives in Education (PACE) project in another effort to mine portfolio assessment for what it can reveal about the bigger picture of teaching and learning. In the project, funded by the Rockefeller Foundation and the Annie E. Casey Foundation, teachers from urban middle schools in Fort Worth, Pittsburgh, Rochester, San Diego, and San Francisco have gathered for close looks at portfolios-not only to chart a picture of students' growth over time, but also to raise their schools' consciousness about what opportunities students get to continuously use their minds in more demanding ways.
This kind of protocol can also yield important information, Wolf observes, on how well curriculum connects across the grades. Even in more privileged school communities, she notes, and especially in middle schools, "we often could not find in portfolios a progression of opportunities from simple to demanding between the sixth and the eighth grade. And within any one year kids are typically writing the same thing over and over again in the same context. The topic may change, but not the demand level."
Public discussions about opportunity to learn, Wolf says, typically rely on "easy countables" such as property-tax dollars, numbers of books in the library, or staff education levels. "We need to think more about the conditions-the real quality-that make learning possible," she urges. "If schools want to become accountable, not just 'be held' accountable, we must make that visible and discussable."
Seeing the Student Anew
The protocols described thus far fall on the more evaluative end of the spectrum of ways to look at student work. Toward the other end of the continuum lie methods that deliberately steer away from evaluation of student work and toward close description instead.
When a group agrees to withhold judgment and simply describe what each member sees in a piece of student work-absent its context or any other introductory information-a new sense of wonder can emerge about the way the particular student engages with learning, writes Harvard University's Steve Seidel. The "collaborative assessment conference" he developed with colleagues at Project Zero sets the stage for teachers to open themselves to the interests, passions, and direction that reveal themselves subtly in their students' work, and to unearth new means of reaching them.
Describing work closely without leaping to judgment proves extremely difficult in a culture used to a "thumbs up, thumbs down" critical style, Seidel observes. Yet as each participant offers what she notices to the group, the student's work yields new insights-not only about its own complexity, but also about the subjectivity of teaching in general. Because this structure demands raising questions more than finding answers, Seidel says, teachers find it both frustrating and exciting.
The Descriptive Review of a Child, developed by Patricia Carini at the Prospect Center for Education and Research in Bennington, Vermont, also emphasizes discovering the whole child and also frames that inquiry around a question the teacher brings to a group of peers (sometimes including the child's parents). But for its "text" it takes a wide range of observed characteristics: physical presence, relationships, disposition, and interests as well as formal learning behaviors. As one by one participants amplify the teacher's description, and as the group then questions and comments on the observations, the child becomes increasingly "visible," writes Rhoda Kanevsky, a teacher in Philadelphia who has used the process for many years. "The child emerges as a unique person who is trying to make sense of the world," she says; and all participants gain new insights into the complex business of teaching and learning.
At New York's Central Park East Elementary School I, teachers gather regularly for such Descriptive Reviews of children, says principal Jane Andrias. But they also use that process for reflecting on curriculum and practice. Last year, for instance, the staff began an inquiry into the school's homework practices, using the descriptive protocol to surface harmonies and discrepancies among different teachers' practices and perceptions. "We began to see change right away," says Andrias. "Now we'll continue to meet this way and reflect on where we're going."
At Pasadena High School, Christelle Estrada says, her "critical friends group" found the purely observational approach to looking at student work so powerful that it has also begun using a similar protocol for peer review among teachers. And at Harmony School, Daniel Baron has devised a "constructivist" descriptive protocol in which students describe the qualities in their own "best work" from any time or place, then look for those same qualities in work they do for assignments.
Perhaps the most sweeping of all descriptive review protocols made its stage debut last January, when one full day's ordinary work collected from a sample of students in a small Minnesota school district came under the lens for collective discussion.
Such a "vertical slice" affords a unique cross-sectional look at the evidence, says Joe McDonald, and can yield powerfully authentic answers for schools struggling to look honestly at the need for change. Already several groups have chewed on the Minnesota slice, and several Coalition member schools are taking a comparable approach to exploring the issues they face.
The Minnesota slice came to the table in heavy brown cardboard boxes, but it might as easily have appeared on a small computer disk, David Niguidula points out. "The conversation grows more comprehensive and rich," he notes, "when you have access to different media-say, a video clip of a student doing peer tutoring-and when you can include people even if they're not all in the same room at once."
The Croton-Harmon, New York school district is one of six pilot sites where "digital portfolios" have provided a way to look at student work over time-work organized on disk according to the school's own goals. Indeed, notes Croton's former superintendent, Sherry King, coming together to design and discuss the digital portfolios brought the entire district together in a coherent way around student work, and helped articulate a common vision to link teaching and learning at every level.
How to Do It, and Why
It's not easy, of course, for a group to suspend judgment as it regards student work together; and a recent Atlas Communities paper (complete with its own protocol) lays out some guidelines for those who try it, no matter which method they select. Not unlike the norms for a good text-based Socratic seminar, these suggestions focus on sticking to the evidence, on understanding where other perspectives arise from, and on identifying patterns that emerge as discussion continues.
In another forthcoming guide from the Atlas Seminar, David Allen, Tina Blythe, and Barbara Powell have useful advice for those who would gather around student work; and they describe ways that several Atlas school communities have made up their own collaborative protocols to suit specific purposes. Indeed, logistical questions-who should participate, when and where to gather, who will facilitate the discussion and how, and which format fits the situation best-loom large in a context already charged with anxiety about exposing one's professional work.
Despite the benefits of hearing from multiple perspectives, for example, many who have used these protocols caution teachers to practice them in a safe environment before trying out broader forums. Joel Kammer writes of witnessing how a politically motivated audience tainted the atmosphere of trust when teachers presented work before an adversarial school board.
Yet some of the most useful feedback in protocols, teachers say, comes when they involve students and parents. "Our conversations took a huge leap forward when students joined the adult audiences at our senior exhibitions," says Allison Rowe at New Hampshire's Souhegan High School. "The protocols taught us to conduct that discourse without saying hurtful things."
No matter how powerful the experience of looking together at work, warns Nancy Mohr, who used several of these protocols for years as principal at University Heights, the challenge of bringing the insights it yields back into daily practice may prove daunting. "It is essential," she says, "to establish a process for taking what we learn from the examination of student work and using it in classrooms." That may involve teaming this work with other forms of professional development that are embedded in practice, like peer coaching or critical friends groups.
And although it takes time and effort, few who have tried it would give up this simple and powerful practice. "The more I looked, the more I saw," observed Brad Stam, a teacher at San Francisco's James Lick Middle School, after his first collaborative assessment conference.
"It affects everything you do," another teacher observed. "Once you routinely look at the work, you can begin moving from just ideas into your daily practice and planning."
Once begun, that cycle-reflecting together on direct evidence, drawing out its meaning, then folding what we learn back into the daily work-may prove the very engine of school change in the critical years ahead. "I used to think student work was between student and teacher," says Jon Appleby, who teaches at Maine's Noble High School. "Now I think all work should be as public, and as shared, as possible." When a teacher can say that, things have begun to move.
Some Guidelines for Learning from Student Work
In "Learning from Student Work," Eric Buchovecky of the Atlas Communities project has described a collaborative process adapted from the work of Mark Driscoll at Education Development Center and that of Steve Seidel and others at Harvard University's Project Zero. The piece lays out useful reminders for how participants can stay focused on the evidence before them and on listening to multiple perspectives, rather than getting bogged down in assumptions or evaluations. Those norms are summarized with the author's permission here:
When looking for evidence of student thinking:
When listening to colleagues' thinking:
When reflecting on your thinking:
When you reflect on the process of looking at student work, ask:
The Collaborative Assessment Conference
Developed in 1988 by Steve Seidel and his colleagues at Harvard University's Project Zero, the Collaborative Assessment Conference asks teachers to look together at pieces of student work and discuss, quite literally, what they see in the work. Through observing and describing the work, participants practice "looking more and seeing more" of what is in the work. The protocol is based on the notion that students are often working on problems or exploring interests that extend beyond the parameters of the assignment. To see the student's work fully, a teacher may have to look beyond those parameters and, with the help of colleagues less familiar with the child and the assignment, mine the piece for new insights.
Initially intended for use in middle and high schools, the Collaborative Assessment Conference is often used in elementary schools as well. Conferences can focus on all types of student work, though they tend to work best with open-ended assignments (as opposed to worksheets). The process takes anywhere from 45 minutes to an hour and a quarter, using these steps:
1. Getting started. The group chooses a facilitator to keep it focused. Then the presenting teacher gives out copies of the selected work or displays it so all can see it. At this point she says nothing about the work, its context, or the student. The participants read or observe the work in silence, making notes if they like.
2. Describing the work. The facilitator asks, "What do you see?" Participants respond without making judgments about the quality of the work or their personal preferences. If judgments emerge, the facilitator asks the speaker to describe the evidence on which the judgment is based.
3. Raising questions. The facilitator asks, "What questions does this work raise for you?" Group members ask any questions about the work, the child, the assignment, the circumstances of the work, and so forth that have come up for them during the previous steps of the conference. The presenting teacher makes notes, but does not yet respond.
4. Speculating about what the student is working on. The facilitator asks, "What do you think the child is working on?" Based on their reading or observation of the work, participants offer their ideas.
5. Hearing from the presenting teacher. At the facilitator's invitation, the presenting teacher provides her perspective on the work and what she sees in it, responding to the questions raised and adding any other relevant information. She also comments on any unexpected things that she heard in the group's responses and questions.
6. Discussing implications for teaching and learning. The group and the presenting teacher together discuss their thoughts about their own teaching, children's learning, or ways to support this student.
7. Reflecting on the conference. Putting the student work aside, the group reflects together on how they experienced the conference itself.
The Tuning Protocol: A Process for Reflection on Teacher and Student Work
The "tuning protocol" was developed by David Allen and Joe McDonald at the Coalition of Essential Schools primarily for use in looking closely at student exhibitions. In the outline below, unless otherwise noted, time allotments indicated are the suggested minimum for each task.
I. Introduction [10 minutes]. Facilitator briefly introduces protocol goals, norms and agenda. Participants briefly introduce themselves.
II. Teacher Presentation [20 minutes]. Presenter describes the context for student work (its vision, coaching, scoring rubric, etc.) and presents samples of student work (such as photocopied pieces of written work or video clips of an exhibition).
III. Clarifying Questions [5 minutes maximum]. Facilitator judges if questions more properly belong as warm or cool feedback than as clarifiers.
IV. Pause to reflect on warm and cool feedback [2-3 minutes maximum]. Participants make note of "warm," supportive feedback and "cool," more distanced comments (generally no more than one of each).
V. Warm and Cool Feedback [15 minutes]. Participants among themselves share responses to the work and its context; teacher-presenter is silent. Facilitator may lend focus by reminding participants of an area of emphasis supplied by teacher-presenter.
VI. Reflection / Response [15 minutes]. Teacher-presenter reflects on and responds to those comments or questions he or she chooses to. Participants are silent. Facilitator may clarify or lend focus.
VII. Debrief [10 minutes]. Beginning with the teacher-presenter ("How did the protocol experience compare with what you expected?"), the group discusses any frustrations, misunderstandings, or positive reactions participants have experienced. More general discussion of the tuning protocol may develop.
Guidelines for Facilitators
2. Be protective of teacher-presenters. By making their work more public, teachers are exposing themselves to kinds of critiques they may not be used to. Inappropriate comments or questions should be recast or withdrawn. Try to determine just how "tough" your presenter wants the feedback to be.
3. Be provocative of substantive discourse. Many presenters may be used to blanket praise. Without thoughtful but probing "cool" questions and comments, they won't benefit from the tuning protocol experience. Presenters often say they'd have liked more cool feedback.
Norms for Participants
2. Contribute to substantive discourse. Without thoughtful but probing "cool" questions and comments, presenters won't benefit from the tuning protocol experience.
3. Be appreciative of the facilitator's role, particularly in regard to following the norms and keeping time. A tuning protocol that doesn't allow for all components (presentation, feedback, response, debrief) to be enacted properly will do a disservice both to the teacher-presenters and to the participants.
The California Protocol
Many teachers in California's Coalition member schools routinely use the tuning protocol to surface issues arising from close examination of student work. But the state's Restructuring Initiative, which funds some 150 schools attempting whole-school reforms, has also adapted and expanded the protocol for a new purpose: to examine how such issues relate to the larger school organization and its aims, and to summarize and assess its progress. Instead of having teachers present student work, the California Protocol has a school's "analysis team" work through an important question (possibly using artifacts from their work) in the presence of a group of reflectors, as follows:
1. The moderator welcomes participants and reviews the purpose, roles, and guidelines for the Protocol. [5 minutes]
2. Reflectors ask brief questions for clarification, and the Analysis Team responds with succinct information. [5 minutes]
3. Analysis Team gives its analysis. [25 minutes]
4. Reflectors ask brief questions for clarification, and the Analysis Team responds with succinct clarifying information about the Analysis. [5 minutes]
2. The Analysis Team observes and listens in on the feedback process. They may also wish to caucus informally as the feedback emerges and discuss which points to pursue in the Reflection time to follow.
3. Each Reflector Group shares one or two supportive statements and essential questions that push further thought. [5 min.]
Team Reflection and Planning
Debrief and Closure
The Primary Language Record & The California Learning Record
The Primary Language Record
Its users praise the Primary Language Record for its flexibility: teachers decide for themselves the frequency, format, and style of their recorded observations. In every case, a parent interview starts the year; all teachers of the child make notes on her developing literacy; joint conferences with parents and child end the year; and that information influences the next year's planning.
The shared reporting mechanism also encourages a shared view of how language skills develop and provides coherence among teachers across the grades. All teachers use "reading scales" that describe progress across the years from dependence to independence as a reader, and from inexperience to experience in reading texts across the curriculum. Speaking and listening skills are also recorded in many different contexts, from dramatic play to science investigations. The scales have another benefit: they can be analyzed in aggregate to give schools and districts an overall picture of students' language skills and adjust their strategies accordingly.
Teachers affiliated with the New York Assessment Network in New York City have worked with the authors of the Primary Language Record to create their Primary Learning Record, which shares most of its important characteristics.
The California Learning Record
California educators adapted the Primary Language Record in the late 1980s, with permission from its British authors, to test its usefulness in tracking language skills across the curriculum and across grade levels, including in secondary school and with a special emphasis on recognizing the literacy skills of bilingual students. Like its counterpart, the system has teachers meet with parents and students at the start of the year, observe and document student progress in different contexts during the year, and assess progress at year's end while planning further work. It uses the same scales as does the Primary Language Record (above), as well as a new high school scale and another scale (developed by the British) that describes the bilingual child's development in English. Teachers phase in the method, in which most of the evidence of progress comes from teacher observation and student portfolios, over two years of professional development. Student self-assessment is also an important part of the California method at the upper elementary, middle and high school levels.
More recently, California and British educators together developed a method of "moderating" the Learning Record results that has important implications for the use of this model. In this process, teachers meet with each other (and often with parents as well) to look closely at portfolio samples and talk through the ratings they received. This happens not only at the school level but again in district or regional groups, and boosts the reliability of teacher ratings so they might serve as an alternative or complement to norm-referenced tests in evaluating school programs. Moreover, by honoring and recording the student's larger experience with language-before the school experience and outside it, in English and in other languages-the CLR adds meaningful parental and student involvement to the process of looking thoughtfully at student work.
For more information about the Primary Language Record and the California Learning Record, contact the Center for Language in Learning, 10610 Quail Canyon Road, El Cajon, CA 92021; tel. (619) 443-6320. E-mail: email@example.com; Web address: http://www.electriciti.com/~clrorg/clr.html
The 'External Review' of Portfolios and Exhibitions
Many Coalition schools have begun regularly inviting a panel of outsiders-university people, legislators, members of the business community, and other educators-into the school to review and comment on a sample of student portfolios and exhibitions.
At University Heights High School in the Bronx, the External Review gathers some two dozen outsiders for three hours to look at one particular student's work across her career at the school. Students at University Heights use their project work (in the humanities; math, science, and technology; and service and health) to demonstrate in portfolios their competence in each of five cross-curricular domains: communicating, crafting, and reflecting; taking responsibility for myself and my community; critical thinking and ethical decisionmaking; recognizing patterns and making connections; and working together and resolving conflicts. The Review works like this:
1. The large group splits into five small "base groups," which review the five domains, discussing what the categories mean to them and what their own expectations might be in each domain. [30 minutes]
2. The large group comes together again and splits into five new "domain groups," each of which uses one of the five domains as a focus to describe one student's work in the Senior Portfolio, which contains evidence from throughout her time at the school. [90 minutes]
3. The "base groups" reconvene and members from each domain group report on what they found. Since everyone has been working with the same student's portfolio, they discuss their differing perspectives on the work. [30 minutes]
4. The large group joins to make any recommendations to the school, based on their close look at one portfolio and their insights from small-group reflections. [30 minutes]
At Central Park East Secondary School (CPESS), the External Review consists of a day-long workshop also involving reading, reflection, and discussion about whether the school's graduation standards measure up to outside expectations. The visitors-who are researchers, principals, and teachers from other schools as well as some district and state officials-publicly interview two recent graduates and privately examine three full "graduation portfolios" compiled by still other recent graduates, then share their reactions frankly among themselves while the faculty looks on. In a variation on this protocol, CPESS has also asked experts in a particular field-say, college writing instructors or scientific researchers-to examine and comment on portfolios of student work in that area.
Making the Whole Student Visible: The Descriptive Review of a Child
At the Prospect Center for Education and Research in Bennington, Vermont, Patricia Carini developed one of the earliest and most influential processes for reflecting on students and their work. As the Center began to archive examples of student work from the Prospect School, an independent school founded in 1965, Carini and her staff recognized the potential for teacher learning through close collaborative looks at such work. The ensuing "Descriptive Review of a Child" comprises a series of rounds of description in which the observations of a number of participants accrue around a few focused questions.
The process aims, writes Rhoda Kanevsky in her essay condensed below, to "make the child visible" as a "unique person who is trying to make sense of the world." Guided by a facilitator, the presenting teacher describes the child; then questions and comments from other participants evoke new information and insights. The intent, she says, is not to change the child but to help the teacher see the child in a new light, and "use the child's interests and values to create harmony in the child's school life." The protocol is summarized as follows:
1. The chairperson convenes the session. The teacher-presenter gives the child's basic statistics: a pseudonym for the sake of privacy, as well as such facts as grade, age, and birth order. The chairperson describes the teacher-presenter's "focusing question" (e.g., "How can I help Jason work more productively with other children in the classroom?").
2. The presenting teacher may describe the classroom context if it would be helpful to participants: the room plan, setting, schedule, etc. Then she describes the child, including both characteristic and unusual behavior, using the prompts in the following categories:
Physical Presence and Gesture. Characteristic gestures and expressions: How are these visible in the child's face, hands, body attitudes? How do they vary, and in response to what circumstances (e.g., indoors and outdoors)? Characteristic level of energy: How would you describe the child's rhythm and pace? How does it vary? How would you describe the child's voice: its rhythm, expressiveness, inflection?
Disposition. How would you describe the child's characteristic temperament and its range (e.g., intense, even, up-and-down)? How are feelings expressed? Fully? Rarely? How do you "read" the child's feelings? Where and how are they visible? What is the child's emotional tone or "color" (e.g., vivid, bright, serene, etc.)?
Relationships with Children and Adults. Does the child have friends? How would you characterize those attachments? Are they consistent? Changeable? Is the child recognized within the group? How is this recognition expressed? Is the child comfortable in the group? How would you describe the child's casual, day-to-day contact with others? How does this daily contact vary? When there are tensions, how do they get resolved? How would you describe the child's relationship to you? To other adults?
Activities and Interests. What are the child's preferred activities? Do these reflect underlying interests that are visible to you? For example, does drawing or story writing center on recurrent and related motifs such as superhuman figures, danger and rescue, volcanoes, and other large-scale events? How would you describe the range of the child's interests? Which interests are intense, passionate? How would you characterize the child's engagement with projects (e.g., quick, methodical, slapdash, thorough)? Is the product important to the child? What is the response to mishaps, frustrations? Are there media that have a strong appeal for the child (e.g., paint, blocks, books, woodworking)?
Formal Learning. What is the child's characteristic approach to a new subject or process or direction? In learning, what does the child rely on (e.g., observation, memory, trial and error, steps and sequence, getting the whole picture, context)? How does that learning approach vary from subject to subject? What is the child's characteristic attitude toward learning? How would you characterize the child as a thinker? What ideas and content have appeal? Is there a speculative streak? A problem-solving one? A gift for analogy and metaphor? For image? For reason and logic? For insight? For intuition? For the imaginative leap? For fantasy? What are the child's preferred subjects? What conventions and skills come easily? Which are hard?
3. The chairperson summarizes the teacher's portrayal, calling attention to any dominant themes or patterns.
4. The chairperson asks for descriptions from others who have worked with or observed the child. The presenter may also report comments from others who are not present.
5. The chairperson briefly describes the child's previous school experience, any important medical data, and any family information directly supplied to the school by the family (not by hearsay). The teacher also reports what she knows directly from the family. Unless the family is included in the Review, the review focuses primarily on what the teacher can do to support the child.
6. After the chairperson restates the focusing question, the participants offer questions or comments. This opens out multiple perspectives and generates new information that may enhance the teacher's insights, expectations, or approach, or may even shift her focusing question itself.
7. The chairperson summarizes this new information, restates the focusing question, and asks for recommendations drawn from both the foregoing description and participants' own experiences and knowledge of other children. These recommendations focus on ways to support the child's strengths (not change the child) and create harmony in his or her school life. They may contradict or build on each other, and the teacher need not comment on them or take them. They serve as a resource for all present.
8. The chairperson pulls together and critiques the Review, summarizing any themes in the recommendations and any plans for follow-up.
Condensed with permission from Exploring Values and Standards: Implications for Assessment. New York: NCREST, Teachers College, Columbia University, 1993.
Surfacing the "Opportunity to Demonstrate" Factor
In six urban school districts, Dennie Palmer Wolf's Performance Assessment Collaboratives in Education (PACE) at Harvard University has focused on portfolios as a means to look at learning over time. When PACE teachers come together to look at their students' portfolios, however, they often focus not only on whether substantial learning has taken place over a span of, say, one year. They also ask what opportunities teachers and the curriculum offered students to learn and demonstrate worthwhile things. The protocol looks like this:
1. Teachers bring together (from one heterogeneous class or different classes) at least a dozen samples of portfolios that represent strong work, satisfactory work, and work from students who are struggling.
2. Using examples from previous sessions, experienced teacher-leaders frame the inquiry's dimensions. For example: How rigorous were the assignments? Did the student know the standards for good work? What opportunities did students have to move from first-draft work to better work later on? Is there evidence of supporting conditions: the chance to take work home, conferences, peer critiques?
3. With such questions in mind, teachers read and take notes on two samples from each "performance level" of the portfolios they have brought themselves.
4. In pairs, teachers select and read two samples from each level of their partner's portfolios, taking notes on where they want more information, what questions the work raises for them, and where opportunities for learning might be enhanced. They take turns interviewing each other about their observations, then together list the possibilities for change.
5. The larger group comes together to discuss their observations. What strong practices seem to support improvement, they ask, and which inhibit it or set a low ceiling? What new classroom strategies might emerge from this? Which practices might look promising but prove troubling in practice (such as rubrics that are imposed without reference to the work, or all-purpose reflection sheets photocopied from a textbook)? If possible, they make plans to try these out and come back to the group with the results.
Sampling a "Vertical Slice" of Student Work
What might one learn by examining all the student work produced during a narrow time period by a broad sample of students in a particular school or district? In a 1996 project of the Bush Educational Leaders Program at the University of Minnesota, one Minnesota district agreed to capture such data in a "vertical slice" that would gather one day's "ordinary work" and analyze what it revealed about the purposes of education in a real district referred to here as "Prairieville."
The collection came from a sample of two Prairieville elementary classrooms at each grade in two socio-economically different schools, and from a sample of secondary students that cut across curriculum "levels." Everything students did from the morning of January 10 to noon on the following day-homework, worksheets, artwork, notes, drafts, even discussions or events captured on audiotapes, videotapes, or photographs-was to make up the completed archive. Later, groups of school people pored for two hours over its contents. Then, in a Socratic seminar with the archive as its "text," they discussed the implications of what they saw.
This method is quite new and open to adaptation; in fact, the Essential school people who will try it again at the 1996 CES Fall Forum plan to adapt it to a new purpose and guiding question. In a planning session at Brown University, they came up with the following strategy for those considering a "slice" of their own design:
1. Decide on the purpose of your slice. The Prairieville school district administration, for example, wanted to hold up the daily reality of schooling against the district's stated philosophy. But a school might also use the slice to shed light on a particular problem it faces.
2. Come up with a guiding question. Prairieville asked, for example, "What does this work reveal about the dominant purposes of a Prairieville education? Does it seem to portray different purposes for different students, subjects, schools, or levels of schooling?" In a slice involving one heterogeneously grouped high school, the question might be, "Is class work appropriately challenging for all students?"
3. Decide on a sampling strategy. Depending on your purpose, the sample should be distributed across the range of groups you want represented, which may be different schools, socioeconomic concentrations, grade levels, or curriculum groupings, whether formal (such as vocational education, Advanced Placement, or special education) or informal (such as band students). Though this distribution cannot be scientifically prescribed, it will determine how useful the slice proves in answering your guiding question.
4. Identify the methods of the slice. Will you ask only for work on paper or can you collect other artifacts: artwork, photos, audiotapes, videotapes, student logs or reflections, information on what goes on outside of school hours? Will you see the work in context or divorced from assignment sheets, discussions, and the like?
5. Decide on the duration of the slice. Prairieville used a day and a half; depending on your situation you might choose a time period of up to a week. This is a cross-section, not a longitudinal study; and remember, work piles up fast.
6. Arrange the logistics. Someone will need to collect the work; gather parental permission to analyze it; remove from it all identifying names; copy it; create and organize the archive in an accessible form. Funding for this from an interested university or foundation partner could help.
7. Decide how to interrogate the slice. A number of discussion protocols might prove useful. For instance, the Fall Forum used a Socratic seminar conducted in a "fishbowl":
What to Look for in Student Work: Some Standards for 'Authenticity'
What intellectual standards should serve as a foundation for the highest quality teaching and learning? Fred Newmann, who directs the Center on Organization and Restructuring Schools at the University of Wisconsin, has come up with a set of criteria for what he calls "authentic instruction and assessment," which can serve as a resource for reflection by teachers examining student work.
Though Newmann provides rubrics for applying his standards in the fields of mathematics and social studies, he intends them not as a mechanical scoring system (for students, teachers, or schools) but as a stimulus for discussion of larger standards and continuous school improvement. He urges groups of reflective teachers to try them out, debate them, perhaps modify them, and finally decide whether their school should incorporate them into its vision and seriously implement them for all students and teachers.
This complex process, he believes, cannot be done individually; it depends on teachers working in collegial small groups over time to discuss and develop their ideas. In Chicago, eleven Essential schools that are part of the Annenberg Challenge site there have agreed to use Newmann's standards as benchmarks for their collaborative looks at student work both within schools and as part of their "critical friendships" across schools.
Teachers should examine classroom instruction, assessment tasks, and student performance, Newmann believes, for whether they are "authentic" in construction of knowledge, disciplined inquiry, and value beyond school. He defines "authenticity" using the following criteria:
For Assessment Tasks
1. Organization of information. The task asks students to organize, synthesize, interpret, explain, or evaluate complex information in addressing a concept, problem, or issue.
For Classroom Instruction
1. Higher-order thinking. Instruction involves students in manipulating information and ideas by synthesizing, generalizing, explaining, hypothesizing, or arriving at conclusions that produce new meaning and understandings for them.
For Student Performance
Newmann uses these standards to assess the intellectual quality of student performance in mathematics and social studies. With adaptation to the different disciplines, he notes, they can be used to assess the quality of student performance in a variety of academic subjects.
1. Analysis. Mathematics: Student performance demonstrates thinking with mathematical content by organizing, synthesizing, interpreting, hypothesizing, describing patterns, making models or simulations, constructing mathematical arguments, or inventing procedures. Social Studies: Student performance demonstrates higher order thinking with social studies content by organizing, synthesizing, interpreting, evaluating, and hypothesizing to produce comparisons, contrasts, arguments, application of information to new contexts, and consideration of different ideas or points of view.
2. Disciplinary Concepts Mathematics: Student performance demonstrates an understanding of important mathematical ideas that goes beyond application of algorithms by elaborating on definitions, making connections to other mathematical concepts, or making connections to other disciplines. Social Studies: Student performance demonstrates an understanding of ideas, concepts, theories, and principles from social disciplines and civic life by using them to interpret and explain