Taking Stock: How Are Essential Schools Doing?
Figure 1: Essential Schools' Performance: Some Preliminary Figures
Figure 2: Common Measures: What Teachers Feel About Essential Schools
Figure 3: Qualitative Questions to Help Assess Essential Schools
Figure 4: Common Measures: What Students Feel About Essential Schools
Figure 5: Tracking Student Success from Experimental Schools: The Eight-Year Study from the 1930s
Figure 6: When Researchers Visit Your School
Do students in Essential Schools perform better? As results start to come in, the chief problem is how to answer this in thoughtful and precise ways--without losing the Coalition's focus on intellectual depth as defined by each local school community.
A wry joke, which we heard from Judy Lanier, dean of the school of education at Michigan State University, is making the rounds of the education world. Think of John F. Kennedy in 1960, Lanier suggests, holding a press conference to announce his plan to land an American on the moon. "The project will take ten years," she imagines the President saying, "but the first six years will be spent building a giant telescope so that we can look at the astronaut when he finally gets there."
Looking carefully at how schools are doing has become an overriding concern in American political and educational circles during recent years, and the Coalition of Essential Schools rightly comes in for its fair share of scrutiny. But many in the Coalition fear that, like that giant telescope, the push for accountability may soon overshadow the very classroom change it aims to stimulate. The questions people ask are tough and various, but in the end they come down to one: "Is there any evidence that students do better when they attend Essential Schools?" This goes to the vulnerable heart of Essential school reform; and precisely because it is so complicated, it may tempt schools in the midst of change to offer simplistic or skewed responses that could skirt the real issues involved.
The chief problem in coming to an answer, of course, is that the question of how Essential Schools are doing can be approached from so many different perspectives. How students are doing on conventional measures like basic skills test scores, attendance rates, graduation rates, and college acceptances is easy enough to measure. But how do we track their thoughtfulness, their originality or creativity, their ability to handle the ill-structured problems that show up in real life? If the aim is to tell whether the Coalition itself is a success, do we judge that by the number of schools involved, or by the depth of what is going on in those schools? Is the assessment aimed at quality control--making all Essential Schools "stand for" the same thing--or does it attempt to assess diverse grass-roots efforts despite wide variations in practice? And what is at stake in being seen as a success--higher teacher salaries, more professional development, or a school's very existence in a marketplace that involves more and more choice?
Without a context that acknowledges such questions, any drive for accountability must run aground. Still, self-evaluation is a critical part of the Essential Schools effort, and the struggle to frame that evaluation in terms that are both thoughtful and precise, both qualitative and quantitative, has occupied the Coalition increasingly in recent years. On top of efforts launched by leaders in state government and academic circles, the recent education initiative of the Bush Administration has added impetus to the search for ways to make assessment richer, fairer, and deeper. Key Coalition figures serve on many of the national groups in government, academia, and business that are examining accountability and testing issues. And major projects within the Coalition will soon begin to track student progress over the long term.
Serving as an umbrella over these in-house efforts is a new Coalition unit known as Taking Stock. Under its aegis, a nine-year longitudinal study following Essential School students through high school and the five years afterwards will soon begin. Already underway is a concerted attempt to gather similar information from Coalition member schools on the progress of their students along carefully framed measures. And soon to be formed is a high-level panel of outside advisers to review the work of Taking Stock as well as external studies of the Coalition, and to make recommendations regarding additional assessment research.
What and How to Compare?
One of the key Essential School principles prods teachers to plan their classes backwards from some demonstrable way students can exhibit mastery at the course's end--not by requiring students to regurgitate textbook facts, that is, but by asking them to link concepts across the disciplines, think on their feet, speak and write persuasively about things that matter to them. What should result, the Coalition argues, is a system of assessment that relies primarily on performances or exhibitions (either at a course's end or at graduation), and on portfolios of student work demonstrating progress over time.
Can such performance-based assessments be quantified in a way that will satisfy the current bureaucratic hunger for a high-stakes accountability system to measure our schools? Given the considerable differences among classrooms and school communities that the Coalition fosters, can we devise ways to measure students against some "national standard" of excellence without resorting to simplistic standardized tests? A growing national alliance--in academia, in business, and in government--is alarmed by the notion of a lock-step national curriculum, or of one multiple-choice national exam that would determine every student's chances after secondary school. And for the first time, via several separate public and private groups, substantial money and energy are beginning to pour into finding out the answers.
In the British Isles and elsewhere, examples already exist of record-keeping based on performances and portfolios. In this country, the National Assessment of Educational Progress has begun a pilot study of higher-order thinking skills assessment techniques in science and mathematics. The National Center on Education and the Economy is working on a comprehensive and controversial "New Standards Project" with offices in Pittsburgh, Rochester, and Washington, D.C. And many states, like Vermont and Connecticut, are beginning to include portfolios, open-ended test questions, and "best pieces" in the systematic comparison of student progress. How such assessments are adjusted to reflect differences among schools and student bodies is a subject too complex to describe here, but it relies on visiting teams of teachers trained to evaluate the consistency of scoring practices from school to school. An element of public reporting and display is also present in most such plans, so that the community itself can be the final judge of how well its school is doing.
Given the complexity and expense of developing comparative and quantifiable records of student thoughtfulness at the national level, is it worth it? Many in the Coalition--including Theodore Sizer, its chairman, and Deborah Meier, who heads New York's Central Park East Secondary School--think not. Rather than giving every student in the country the same test, they suggest, why not simply introduce more robust measures of effectiveness at school and community levels-- then let the school focus instead on the last step of accountability, holding up student work for each local community's critical scrutiny? "What is crucial here is who sets the standards," says Ted Sizer. "The local people whose kids' minds are at stake must have direct and meaningful access to those who have the power to make or change their curricula."
Helping that happen--so that state and district officials can craft policies that support those local visions--is the job of the Re:Learning initiative, a joint effort of CES and the Education Commission of the States in Denver. "What we're finding is that if you begin to develop policies without local visions you'll never be able to assess where you are," says Bob Palaich, a senior researcher at ECS. "If you do have that vision, once your district and school agree to focus on Essential School ideas, then you should be able to go ahead and assess students--with authentic performances, performance testing, and basic skills testing." In any case, he notes, the problem of how to compare students nationwide need not be resolved before one can say whether Essential Schools are making progress.
Collecting and Tracking Figures
If each school community articulated its standards of excellence clearly and routinely exhibited them through portfolios and public performances, local communities would have good evidence of how their students were doing. However, very few of the more than 100 Coalition member schools have reached that point. So how can the public judge whether the Coalition's principles are worth working and paying for?
Individual member schools in the Coalition are expected to keep track of their attendance rates, graduation rates, standardized test scores, and college acceptances. Some schools keep better records than others, but where comparative data is available (especially in those charter schools that have been with the Coalition since its inception in 1984), improvement is clear in all those areas. (See figure, page 5.) California's Pasadena High School, for example, is entering its fourth year of working with Coalition ideas; the 600 students who began ninth grade in 1989 are now juniors with two years of Essential Schooling behind them. The dropout rate at Pasadena was 35 to 40 percent when its principal, Judy Codding, arrived four years ago, she notes. "But in that first Coalition class of 600, we can account for all but seven students still being in school," she says. "Attendance in our core Essential School classes is up to around 93 percent. And the percentage of D's and F's in those classes has gone from around 40 percent to 20 to 25 percent." Codding finds this "enormously exciting," and is looking for funds with which she can give the eleventh-grade Stanford Achievement Test this fall to the same students who took the mandated ninth-grade version on entering the Essential School program two years ago. "We also follow their course grades, which reflect how they do on performance-based tests," she says. "We track disciplinary suspensions and expulsions. And of course we're very interested in how many kids are actually going to graduate."
Measures like attendance and discipline may be one indication of student engagement, which is so hard to pin down otherwise. So when schools show dramatic improvements in these areas, Theodore Sizer suggests, one should look carefully at what they are doing right. "I'm beginning to believe, for example, that small schools, or a house system in big schools, do better than big schools on measures like attendance and discipline," he says. "If they know someone notices, kids show up and they don't cut up so much." In schools where such things aren't a problem, as in many affluent suburban schools, he notes, significant changes in student engagement are harder to judge, short of sitting in on classes to observe how actively the students are using their minds.
Even with simpler indicators like attendance, however, schools differ widely in how (or if) they define and report data; and the Coalition has run up against frustrations in trying to assemble comparable information on student progress across member schools. This is a major challenge of the new Taking Stock effort, which aims to gather and coordinate a broad range of information on just how Essential School students fare, and to publish an annual accounting of that information.
Already, the Coalition has commissioned a pilot study for a substantial research project conducted under the Taking Stock umbrella, and its findings have just landed on the Coalition's desks. It will lead to a series of annual "Common Measures" reports, says Rick Lear, the CES senior researcher who coordinated the effort from Providence. "Our goal is first to establish the uniform collection of data around the 'common measures' such as attendance and the like," he says. "Then we'll attempt to establish new 'uncommon measures' to follow students with--ways to track less easily quantified qualities such as thoughtfulness, problem-solving, decency."
Because of differences in how schools collect and record their hard numbers, the most useful part of the School Survey section of this first Common Measures report may be the research team's recommendation for exactly how schools should record information in coming years. But two other sections--a survey of the perceptions of teachers in Coalition member schools and a similar survey comparing involved and non-involved students in some of those schools--provide some qualitative information, which the team hopes can be amplified and refined in coming years.
Teachers who are highly involved in Essential School activities, for example, say they work harder than they did before; but they also enjoy teaching more and are more likely to recommend it as a career. They notice changes in the intellectual habits of their students, although students overall do not report such a change. On pages 6 to 8 of this issue, the survey findings appear in summary; but what does not show up when the data are compiled are the substantial and fascinating variations (revealed in the individual schools' replies) in how various schools are introducing and interpreting Essential School principles. In some schools, for example, Essential School teachers experience more respect from their colleagues; in others, it's quite the opposite. Almost no Essential Schools have taken on the challenge of restructuring their schedules into longer blocks, but where a school has actually done so, even uninvolved teachers report longer periods. Few schools are using portfolio assessments, whether they are very involved in the Coalition or not. Findings like these are useful primarily not because they prove anything, but because they shed light on the complex political issues of change--whether, as ethnographers Donna Muncey and Pat McQuillan point out, teachers in the vanguard are naive in their expectations or use of power, for example, or whether a school is united in thinking reform is even necessary.
"Reading these results, one becomes sharply aware of the kinds of things that get lost when you try to lump together and quantify data from schools that are proceeding very differently and are at very different stages of the process," says Ted Sizer. "The successes of schools that show major strides may get lost, and schools that haven't come very far may look better than they should. It's part of the frustration of assessing a project that in its very design suggests schools play out the process of change differently."
The fact that more teachers than students appear to be noticing changes in their Essential School experiences does not discourage Sizer. "It reminds us to be realistic," he says. "We're talking about changing the direction of a very large vessel. When you spin the wheel, the people in the pilot house may experience the change, but it takes a long time for the ship to move noticeably." It takes time, Sizer points out, for the changes teachers note--like spending more time talking about learning in faculty meetings--to directly affect the actual experience of students. "If you look at kids who have been at it a long time, you might see more marked changes," he says.
The Anecdotal Evidence
For a good look at students who have been at it a long time, there may be no better person to turn to than Deborah Meier, principal of one of the Coalition's most advanced schools, Central Park East Secondary School in New York City's East Harlem District No. 4. The school has a head start over others in the Coalition because it was an Essential School from its start. In 1991 the first class graduated from its Senior Institute, which replaces, at CPESS, the conventional eleventh and twelfth grades. Though a number of the students who entered the Senior Institute two years ago are staying on for another year of study, all who graduated are going on to college, and all but two of these to four-year colleges. Meier has both a strong sense of who those students are and ambitious plans for tracking their future progress.
"Our students are enormously determined, hardworking, and articulate," she says, "and I think that's one reason colleges have been so impressed with them in interviews. They talk easily to adults--about themselves, about education, about the things they are interested in. Because our school is built around conversation, these kids feel at home with different conversational styles among adults--they know their teachers as adults, and so they're used to what adults interested in intellectual things are expecting. This is not true for most high school kids; it's one reason I believe so strongly that schools must be smaller, or broken into smaller units like houses."
In the long run, Meier says, one must measure students' success by how well they do in life--their works and deeds. "If we define 'well-educated' as 'thoughtful and reflective,'" she says, "it's hard to see how nonreflective, nonthoughtful, decontextualized exams could ever capture it." Toward this end, CPESS will track the progress of 50 students who graduated this June, following not only hard data but issues like self-esteem and ability to handle challenge. "This is a lot of work, and I don't know how expensive it will be," Meier says. "But for me it is more useful than any national assessment. For one thing, schools are a wonderful home base for sharing such information, if they really know their students and graduates. It's also a way that communities can choose their schools by finding out what's important to them--and it's perfectly appropriate for schools to boast about different things. Private schools have always been under less pressure to use traditional assessment, for instance, because they could say 100 percent of their students went on to college. But if a school wants to brag about how many of its students go into political life or into the arts, or make a difference somehow, that's a value system too."
Successes like those at Central Park East and Pasadena sometimes lead people to conclude that Essential School principles make the most dramatic changes at big city schools with large minority populations and the problems of urban unrest. But at Brimmer and May, a small private school in Chestnut Hill, Massachusetts, which has just graduated its first class of seniors who started Essential Schooling in the ninth grade, headmistress Anne Reenstierna declares emphatically otherwise. "I've been here for eighteen years, and it's true that our students have always been motivated and good achievers," she says. "But I have seen a real difference in the classrooms since we began in the Coalition. Students don't just answer the questions we ask, listen, and take notes. They are highly articulate; they have opinions on everything and are ready to question your opinions and your facts. They look at questions from a much broader, interdisciplinary perspective--that's very different. I was in a class on the Holocaust recently, and during the discussion students brought in examples from the apartheid system, from the American civil rights movement, and from Kohlberg's levels of moral development. They initiated most of the questions; the teacher hardly spoke at all." Shortly before graduation Brimmer and May seniors wrote extensive evaluations of their high school experience, which were overwhelmingly positive, Reenstierna says. "Now other aspects of school life will need to change to reflect that greater student involvement," she notes. "They will be working more with teachers and administrators on critical issues like curriculum, self-evaluation, and long-range planning."
Can more qualitative anecdotal evidence like this help at all in assessing how the Coalition is doing? To find out, the Taking Stock effort has just hired Donna Muncey, an ethnographer who has spent the last several years documenting and analyzing school change in a number of Coalition schools. She will head an ambitious nine-year study, following 50 to 75 Essential School students through high school and the five years that follow. "The design problems in setting up a study like this are enormous," Muncey says. "But we will probably interview parents, teachers, friends, and employers of the students to see if we can tell whether their school experience actually made a difference in how well they use their minds."
The project is not unprecedented. In the 1930s and early 1940s, the Progressive Education Association tracked 1,500 students from progressive high schools through four years of college, comparing their performance with that of traditionally prepared students. (See story, page 10.) On conventional measures the students did not do much better than their peers, but on tests of problem solving or creativity they did markedly better--and the more boldly their high school had altered its curriculum, the better their performance. The study lends weight, Ted Sizer says, to his hunch that the Coalition schools that are moving most assertively are also the ones where measurable results are most dramatic. "If the faculty identifies what ails a school and takes bold measures to remedy it, you'll see very visible changes in student performance," he says.
Looking at Systemic Change
This is probably the case whether a school sees its problems as unsubtle ones like attendance, or hard-to-quantify ones like intellectual passivity. But a whole different angle on measuring the effectiveness of the Essential School reform effort is to look at whether change is happening system-wide--not only in terms of student achievement, but by assessing climate and policy shifts at the school, district, and state levels. For example, one can gauge success by how much say a school actually has over what it teaches and how. Or one can ask how much planning time its faculty has, how many opportunities teachers have for professional development, even how they are evaluated and how much they are paid. Are the district and state replacing conventional standardized tests with new techniques of performance assessment, to bolster such efforts in individual classrooms? Are teachers being allowed to cross subject-area lines without red tape?
At the Education Commission of the States, the chief challenge for Re:Learning is to line up state and district policies so they support individual schools and teachers in the throes of classroom change. "Until we can fully join these levels so that they work together, the picture is one of progress and halt, progress and halt," says Re:Learning's Bob Palaich. "But there are good signs. A state Re:Learning coordinator can call a high school principal and say, 'I have your proposal in front of me, but how are you going to integrate your vocational ed money into Re:Learning's goals?' When all the different camps-- special ed, Chapter One, accelerated ed, vocational ed, and Re:Learning--stop being independent camps and start working closely together, we will see real movement. We need schools to be internally coherent, not just to have awards on the wall."
These kinds of messy questions tend to frustrate attempts to neatly assess just how the Coalition is doing. For just that reason, outside observers like Donna Muncey and Patrick McQuillan of the School Ethnography Project recommend tracking its progress with measures that clearly identify what level of the system is being assessed. Designing flexible research that spans a long period of time and crosses levels, they suggest, will better reveal answers in their context.
A true and sober assessment of whether Essential schools are really working must probably wait until many years have passed. But Central Park East's Deborah Meier speaks passionately about how to reach answers in the meantime. "What the Coalition is saying about high school education is itself an answer to the question of respectful assessment in this country--by the community, by teachers, and by outsiders," she says. "That's how we should approach it--each school working through its own assessment with its own community and making it public. Commonalities will appear in threads and patterns, and maybe a system of national assessment will be the theory deduced from that collection of assessments." She pauses. "I think of my friends who are well-educated people," she says, "and how different they are. Some hate novels, some can't stand anything but novels. They write in different ways, think in different styles, have different areas of strength. Stop trying to invent the well-educated person that all schools should produce!"
Figure 1: Essential Schools' Performance: Some Preliminary Figures
(Note: Schools in different states and communities collect data in different ways, and students are selected for Essential School programs in different ways. Cross-district comparisons are invalid; bear in mind that these data may legitimately be compared only with a school's own past performance or with general districtwide data collected in the same manner. What follows is a sampling of records submitted by schools.)
1. Attendance and Drop-out Rates
Central Park East Secondary School (New York City):
Central Park East Secondary School attendance rate, 1988-89: 91%
Central Park East Secondary School drop-out rate, 1988-89: 0%
Hope High School (Providence, RI):
Hope Essential School attendance rate, 1990-91: 83%
Hope Essential School drop-out rate, 1987-88: 9%
Thayer High School (Winchester, NH):
Thayer Essential School drop-out rate, 1990-91: 1%
Westbury High School (Houston, TX):
Westbury Essential School attendance rate, 1988-89: 96%
Paschal High School (Fort Worth, TX):
Paschal Essential School drop-out rate, 1990-91: 0%
2. Test Scores
Thayer High School (Winchester, NH):
1986 (pre-Essential status) California Achievement Test scores, grades 7-10: 49th percentile
Westbury High School: (Houston, TX)
Westbury Essential School: 82% of ninth graders passed TEAMS tests
3. Graduation Rates, Academic Performance, and Discipline
Hixson High School (Chattanooga, TN):
Hixson Essential School graduation rate, 1990-91: 92%
Paschal High School (Fort Worth, TX):
Paschal Essential School graduation rate, 1990-91: 80%
Pleasure Ridge Park High School (Jefferson Cty., KY):
Essential School program students with no failures: 81%
University Heights High School (New York City):
In 1989, only 33% of incoming college freshmen in New York possessed a reading level qualifying them to take college classes. After completing the first stage of the University Heights Essential Program, 77% of University Heights students' reading level qualified them to take college classes.
Pleasure Ridge Park High School (Jefferson Cty., KY):
Pleasure Ridge Park Essential School discipline referrals, 1986-87: ES students made up 20% of the junior class and generated only 14% of junior-class disciplinary referrals to the assistant principal.
Westbury High School (Houston, TX):
Westbury Essential School discipline referrals, 1988-89: WES students made up 14.5% of school population and generated only 3.75% of overall disciplinary referrals to assistant principal.
4. Pursuit of Higher Education
Hixson High School (Chattanooga, TN):
Hixson Essential School students (who are all "at risk"), going on to higher education, 1990-91: 75%
Hope High School (Providence, RI):
Hope Essential School graduates, 1988-89: 60% went on to higher education
Thayer High School (Winchester, NH):
Thayer Essential School graduates, 1988-89: 55% went on to higher education
Walbrook High School (Baltimore, MD):
Walbrook Essential School graduates, 1988-89: about 50% went on to higher education
Common Measures: What Teachers Feel About Essential Schools
These responses were gathered from 1,762 teachers in 46 Essential schools, by Kyle Peck, a professor of education at Pennsylvania State University. The survey was commissioned by the Coalition as a pilot study only--intended not to be conclusive but to explore what questions might be usefully asked in a continuing survey to be launched by the Coalition's Taking Stock effort this year.
Seventy-one schools were asked to survey all teaching faculty, regardless of the level of Essential School (ES) involvement. Even among the 46 schools that complied, many teachers within the schools did not complete the survey. The level of ES participation by the schools that did return the survey varied from 24 percent to 97 percent.
Teachers reported where they stood on a five-step scale of involvement in the Essential School effort within their buildings; for the purposes of simplicity we will refer here to teachers who called themselves highly involved as HI, and those who said they had lower involvement as LI.
What follows is a distillation of one section of the survey report, covering only the statistically significant responses by teachers. They fall roughly into four categories, as follows:
Teachers' and Students' Activities
Qualitative Questions to Help Assess Essential Schools
These questions were used by the Committee on Evaluation chaired by Gerald Grant of Syracuse University, charged in 1988 with assessing the progress of the Coalition of Essential Schools. Because they are qualitative rather than quantitative research questions, they can provoke useful thought as schools and outsiders look at Essential School changes in individual situations.
Common Measures: What Students Feel About Essential Schools
These responses were requested from students in nine Essential Schools--both students who participated in Essential School activities (called "ES students" here) and those who did not. Surveys were given to an equal number of ES and non-ES students in each school, but responses came in from 427 ES students and 185 non-ES students. (The statistics that result have been adjusted to reflect this.)
Kyle Peck, who headed the Common Measures pilot study from which these preliminary findings come, points out that interpretation of these results must be tempered by an understanding that "Essential School involvement" means different things from school to school, in terms of both the time students spend in ES-related activities and the nature of the activities themselves. Even the "non-ES" students' responses, he warns, cannot be viewed as representative of students in non-Coalition schools; the very fact that a school has chosen to participate in the Coalition may reflect schoolwide differences from non- participating schools.
Finally, data are reported by the students themselves rather than observed by outsiders, and are subject to the widely recognized tendency of adolescents to report immediate rather than long-range reactions. And although the study did ask schools to give the surveys to a representative sample of students in both categories, Peck's group did not monitor this, so the sample may be skewed in unintended ways.
A sampling of the study's findings that were statistically significant follows.
Personal Data and Attitudes
Classroom Work and Studies
Tracking Student Success from Experimental Schools: The Eight-Year Study from the 1930s
Has anyone ever tried to find out systematically whether what students do in high school really matters when they go on to college? The answer is yes. In an eight-year study launched in 1934 and brought to a halt by America's entry into World War II, the Progressive Education Association followed the progress of 1,500 students from thirty progressive high schools through college, comparing their achievement with that of students from conventional high schools. Their five-volume report, informally known as the Eight-Year Study, has since lapsed into obscurity; but it had widespread effects on how the American secondary school curriculum developed in the decades to follow--and it can provide a useful background to current efforts to assess Essential School progress.
To start, the PEA's Commission on the Relation of School and College got twenty-five leading colleges to agree to admit students from participating schools, even though their high school preparation might not match the conventional distribution of credits. Then it recruited thirty schools and school systems--public and private, senior highs and combined junior highs, large and small, varying widely in student make-up--that were eager to take on the business of examining their own goals and restructuring their curricula accordingly. By removing one key constricting factor--the fear that their students would not get into good colleges--the commission freed schools to try out bold new ways of teaching.
At first teachers did not trust the colleges' promise enough to make real changes in pedagogy and curriculum. But their confidence grew, and eventually thousands of teachers seriously reevaluated what school was for and how students learn. "They found what they sought in the democratic ideal, in the American way of life," wrote Wilford M. Aikin, who chaired the commission. "'The high school in the United States,' they said, 'should be a demonstration, in all phases of its activity, of the kind of life in which we as a people believe.'"
The test was how well students did in college--by the standards of the college, of the students' contemporaries, and of the individual students themselves. The answer was encouraging: students from the experimental schools did a somewhat better job than their counterparts by all three measures--and the more radically the school had restructured its curriculum, the better they did. On conventional tests the experimental group did as well as their peers from traditional schools; but they out-performed their counterparts on tests that measured problem-solving skills, creativity, and the like. And they were more likely to be leaders on their campuses.
"The ways in which these schools were taking risks are very comparable to Essential School ideas," says Theodore Sizer, who chairs the Coalition of Essential Schools. "They believed that classrooms should be student-centered, that every student can learn, that thought should be linked with action." As Essential Schools work toward change, they might recall the words of the study's second volume, Exploring the Curriculum:
"Constant fear of failure, fear of fellow-workers, fear of the administration, fear of the community, fear of not imitating the successful example of someone else who is promoted, fear of change, fear of loss of work, fear of failing to follow the edicts of state departments or colleges of education--such daily fears are almost purely negative in effect. They result in thinking about how to be safe rather than how to be effective. In place of fear, self-confidence will come to the teacher whose fellow-workers and administrative superiors understand and cooperate to work out clearer concepts and new means of achieving them. With every advance will come a corresponding increase in the sense of freedom and release--freedom to think and do; release of all one's energies and capacities."
The findings of the Commission on the Relation of School and College were published by Harper & Row in five volumes in 1942-1943 under the overall title Adventures in American Education. The individual titles are as follows:
Wilford M. Aikin, The Story of the Eight-Year Study (1942)
H. H. Giles, S. P. McCutchen, and A. N. Zechiel, Exploring the Curriculum (1942)
Eugene R. Smith and Ralph Tyler, Appraising and Recording Student Progress (1942)
Dean Chamberlin, E. S. Chamberlin, N. E. Drought, and W. E. Scott, Did They Succeed in College? (1942)
Thirty Schools Tell Their Story (1943)
A good discussion of the Eight-Year Study also appears in Bruce R. Thomas, "The School as a Moral Learning Community," in John I. Goodlad et al., The Moral Dimensions of Teaching (San Francisco: Jossey-Bass, 1990), Chapter 9.
When Researchers Visit Your School
Simply because they are trying out new ideas in the classroom, Essential Schools often find themselves examined within an inch of their lives. Policy makers, doctoral candidates, fellow Essential School people, and journalists may descend upon a school in the throes of change, prodding it for symptoms of success or failure. How should a school react, and where does it draw the line between invasion and accountability to the larger community?
Patrick McQuillan and Donna Muncey have given a lot of thought to these issues as they carried out an independent three-year ethnographic study of Essential Schools in the process of school change. In a recent paper, they set forth seven guiding questions that schools might ask themselves before they deny or grant access to researchers. The school, they argue, can help shape research projects so they benefit not only the researcher and the larger community but the school itself. It can protect the rights of those who participate in the research, and forestall the tensions research could generate among the staff. And it can go a long way toward preventing the researcher from over-simplifying the issues that face education.
(McQuillan and Muncey's paper, "Protecting the Interests of Your School While Promoting Quality Research: Some Issues to Consider When Allowing Research to Be Conducted in Your School," will appear in the September issue of Executive Educator. Reprints can be had by writing the School Ethnography Project at CES, Box 1969, Brown University, Providence, RI 02912.)