to speak.
Of course, we were excited and accepted the invitation. I clearly recall a great degree of anxiety, too. After all, we were simply two faculty members from a comprehensive mid-level university who had a background in using, rather than producing, original research results. In retrospect, I believe this was what fascinated Judy the most. She was genuinely interested in the question, How can we at the institute make a positive difference in the improvement of classroom instruction beyond merely engaging in and writing about research on teaching effectiveness?
Our initial visit to Michigan State could not have gone better. Judy, Larry, Jere Brophy, and the entire faculty were welcoming and hospitable. In addition to our presentation to the group, Judy had arranged for many one-on-one conversations with faculty. Although they were very busy with their teaching and their research, they were welcoming and engaging, on both a professional level and a personal level. (We learned early on in our conversation with Jere Brophy that he was a country music fan!)
Judy also hosted an informal gathering at her home that evening where we could mix and mingle with the faculty and her friends. It was truly delightful, and I like to think that it was at this gathering that we really began our friendship with Judy and with Larry. Larry drove us back to the airport, and as we chatted, I remember thinking, This is one of the nicest people I have ever met. Little did I know that Larry and I would be working much more closely together in the future as the research on effective schools began to gain national prominence.
I don’t recall how many times we visited the institute, but each time, I felt we became closer friends with Judy. I remember thinking that there was the possibility of a position offer in the works and wondered how I would respond. On a personal level, my wife, Star, and I wanted very much to remain within driving distance of our family in Chattanooga, and I was concerned about the high-pressure culture of a large, research-oriented university. No such offer ever materialized, although I believe Judy did give the possibility some thought.
A Summing Up
My experience with the Institute for Research on Teaching and Judy Lanier had a positive impact on my career (and personal life) in a number of ways. First, I gained a much deeper knowledge of and interest in research-based approaches for improving classroom effectiveness. In retrospect, I think this played a significant part in the emphasis Rick DuFour and I placed on gaining shared knowledge about best practices and on collaborative teams engaging in action research in the Professional Learning Communities at Work process.
Rick and I both saw the potential for connecting the findings from the effective teaching research to the clinical supervision observation process. In his inimitable, witty way, Rick frequently pointed out the problem of the Now what? question in post-observation conferences. Rick remarked that observations of teachers, even if done well and accurately recorded, are of little value if the person conducting the observation has little to offer when a teacher asks, “I see that I need to improve my instruction in certain areas. What do you suggest I do?” Rick, with what Becky DuFour and I came to refer to as his dripping sarcasm, would observe that at this point most principals and supervisors are left to say, “Well, actually, I don’t have a clue about how you can improve your teaching or classroom management practices. You see, my skills are in observing and recording, not in the knowledge of effective teaching practices. Sorry.”
The research findings on effective teaching didn’t just help teachers; they also armed observers with a knowledge base that enhanced the effectiveness of the clinical supervision process. Jerry Bellon quickly saw the value in this emerging field of research and began including examples of the research findings in his consulting work with districts. In fact, within a few years, Jerry coauthored one of the first books that synthesized many of the research findings into a handbook for improving classroom instruction (Bellon, Bellon, & Blank, 1992).
Second, I gained confidence in working with schools and school districts and in speaking to groups about effective, research-based teaching practices. This knowledge and confidence enabled me to work with many more districts across the United States. Interestingly, this also changed my relationship with Jerry Bellon. He had viewed me, appropriately, as an associate who helped him work with districts that were interested in improving teaching through the clinical supervision process. As a result of my association with the Institute for Research on Teaching and my friendships with Judy Lanier and Larry Lezotte, as well as my individual work with districts regarding the teacher effects research, he came to view me as a person with my own area of expertise and, to some extent, a growing national reputation.
Third, my professional reputation was given a huge boost by an interview conducted by Willard Duckett, the assistant director of the Phi Delta Kappa Center on Evaluation, Development, and Research, which appeared in the Phi Delta Kappan in 1986. Duckett (1986) began the article by noting that the center and the Kappan were undertaking an initiative to introduce readers to individuals “who make exemplary contributions to research or who make effective, practical applications of research in the administration of public schools” (p. 16). Although I never asked, I always felt the interview was the result of some intervention by Judy Lanier. For this alone, but for much more, I have always been grateful for Judy’s friendship and support during this period of my professional journey.
The article certainly enhanced my national exposure. I think the point that received the most attention was the idea of “legitimizing” research, which Jim Huffman and I had picked up from Herbert Lionberger:
Duckett: By legitimizing, you obviously mean something more than passive acceptance.
Eaker: Exactly. It’s the process of becoming convinced, as opposed to being informed. Being informed doesn’t motivate one to do much. Legitimizing an idea is a process of dispelling fears or inhibitions and coming around to a favorable disposition leading to acceptance. When an idea has been legitimized, one is willing to act on it.
Duckett: Give me an example of how a consumer might legitimize, or go through experiential validation, with regard to research data.
Eaker: Manufacturers often do extensive testing of their products in both the development and the production stages; it is generally to their advantage, in a competitive market, to at least advertise supporting data. General Motors, for example, will have elaborate data on the performance capabilities of Car X. Those data might reveal that Car X can be expected to get between 16 and 31 miles per gallon of fuel (quite an indeterminate range, incidentally). Regardless of how thoroughly Car X was tested, you and I, as consumers, are probably skeptical about how closely “laboratory” conditions at GM match our own local driving conditions. Without questioning the validity of the GM tests, you and I will probably prefer to legitimize or validate such data at the experiential level. So, we ask our neighbors or co-workers about their experiences with Car X. We seek to validate the data in terms of our normal use of a car in everyday driving. To what extent will our driving habits approximate those of the professional drivers at the GM test ranges? How many variables in our environment approximate those at GM? In short, can we expect to average closer to 31 miles per gallon than to 16? If not, why not?
Thus, in seeking to validate the research data from a consumer’s point of view, we do not repeat the manufacturer’s tests, nor do we mount a formal challenge to the methodology. We simply set out to validate by determining the appropriateness of the data for use in our own specific situations.
Duckett: Does the analogy hold for teachers?
Eaker: Yes, we [Jim Huffman and I] think that teachers react to educational research reports in much the same way. Teachers can easily be informed about the research coming from Stanford, the University of Texas, or Michigan State University. But they want to know whether their own classrooms are similar enough to those of the experimental groups to lead them to expect similar results. Their questions go something like this: “Okay, these classroom variables worked well for the research group, but will they work in my classroom?” Our answer is simply, “Let’s check them out. Let’s check the applicability of that research for your situation, your instructional context.” That is what I mean by experiential validation.