WPRI Report Continued:
How Good a Job Do Wisconsin Schools of Education Do?
By Mark C. Schug, Ph.D. and M. Scott Niederjohn, Ph.D.
March 4, 2008
Our purpose is to determine how effectively schools of education in Wisconsin prepare elementary school teachers to teach in MPS. We undertake the task in light of this context:
Could things be different? Yes, but a significant change would require changing the subject. Our public interest in quality education calls for a marked shift in emphasis within departments and schools of education—away from contextualizing and ideology, toward a new focus on the knowledge and teaching skills new teachers need to improve the academic achievement of their students. No conceptual or technical problem stands in the way of making the shift. The steps to be taken are ones that many educators could take, if they chose to do so.
The question is whether teacher trainers and education researchers can put aside their own special interests—political and theoretical—in order to focus on teaching practices known to be effective. We know from ordinary observation that faculty members in some university departments with professional training missions are capable of focusing their effort in this way. Anecdotes prove nothing general, but they do illustrate possibilities. One of us conducted a review of a department outside the School of Education at UW-Milwaukee a few years ago. Faculty members in this department were a lot like other university faculty members. They tended to be political liberals and, judging by the signs and cartoons on their office doors, they were not friends of George Bush or the Republican Party. But in their work as instructors of undergraduates in a professional training program they were completely dedicated to their department’s mission of helping people overcome particular disabilities. Working from a common knowledge base, they focused on training people who could go into the field and get their jobs done right. No such consensus—of purpose or method—exists in schools and departments of education. There, faculty members can’t even agree that using scores from the state’s own curricular examinations is a legitimate way to measure academic achievement, even though doing so has been mandated by state policy.
Design of the Study
We worked with three sources of data to assess the effectiveness of new teachers in MPS. First, we collected information about the programs and teacher candidates of schools and departments of education around the state. We assumed that MPS could hire teachers from any of these state schools and that the full array of programs should therefore be included. To obtain information we surveyed schools and departments of education through e-mails and follow-up telephone calls. As part of this process, we obtained two sets of statewide Praxis II test scores from all schools and departments of education in the state from the Wisconsin DPI.
Second, we used value-added methodology (VAM) to measure the effectiveness of new MPS elementary school teachers trained at various schools of education, according to their students’ achievement gains. With the cooperation of MPS, we established a database that links individual teachers to the achievement gains of individual elementary school students in their classrooms. The database also links individual teachers to the college or university each one attended.
Third, we invited elementary school teachers with up to six years of experience in MPS to participate in a web-based survey designed to identify the parts of their teacher education programs that they found to be most valuable. We mailed our request for participation to over 1,500 new MPS teachers; 193 teachers responded, and approximately 12 percent of the eligible pool provided complete survey responses for use in our analysis. The margin of error for our calculations is 6.65 percent at a 95 percent confidence level. When surveying a finite population, as is the case here, a 12 percent response rate is regarded as statistically acceptable.
New teachers’ content knowledge
To assess teachers’ content knowledge, we asked how well teacher candidates performed on the Praxis II test of content knowledge.
Teacher candidates in Wisconsin are required by PI 34 (a section of the DPI rules governing teacher certification in Wisconsin) to take a content test in order to qualify for state certification. The test currently in use is the Praxis II: Subject Assessment, one test in the Praxis II series. The Praxis II is designed to measure knowledge of the subjects that K-12 educators will teach. It was developed by the Educational Testing Service (ETS) and is used in many states; it was selected for use in Wisconsin by the DPI. The Praxis II includes both multiple-choice and constructed-response (e.g., short essay) test items.
We obtained a Praxis II report of test scores from the DPI. (The DPI had received the scores from ETS.) We used scores from the two Praxis II series tests most relevant to this study, which focuses primarily on the preparation of elementary teachers. The first set was the test scores for the Elementary Education Content Knowledge test (Test Code 0014). The second set was for the Middle School Subjects: Content Knowledge test (Test Code 0146). The scores are summarized in Tables 1-4, below.
Before examining the results, we wish to mention one important caveat. ETS provided the test scores to the DPI for 2005-06. The test-takers who took the Praxis II during this time period probably were admitted to, and probably completed, the teacher training program of the school or department of education identified. But we can’t be sure of that. The scores reported are not necessarily those of program completers. Some measure of error is thus built into analyses based on the scores.
Table 1 shows median Praxis II scores for individuals at UW-System universities who took the Elementary Content Test. Test-takers here include prospective elementary teachers, but in some cases, such as at UW-Milwaukee, these test-takers are more likely to be Early Childhood majors. The average median score is 169—well above the national median score of 163 and well above the state-set qualifying score of 147. Test-takers at UW-Madison and UW-Stout had the highest median scores, at 179. Test-takers at UW-River Falls and UW-Green Bay are next highest, with median scores of 173.5 and 173, respectively—again, well above the national median score of 163. Students from two schools or departments of education perform below the national median: UW-Milwaukee and UW-Platteville, with median scores of 160 and 161, respectively.
Table 1 also shows that teachers who are certified through a program called MTEC perform quite well on the Praxis II test, with a median score of 173—well above the Wisconsin qualifying score of 147 and above the national median score of 163. Note, however, that the number of MTEC test-takers is quite low.
Since MTEC turns out to be an important focal point in our study, and since it is unlike traditional teacher preparation programs in Wisconsin, we digress here from the analysis of Praxis II scores to provide a brief description of the program. MTEC is an alternative teacher training program for MPS, launched in 1996. About 700 MTEC teachers currently teach in MPS, and MTEC is the single largest supplier of new MPS teachers, training about 100 teachers per year. MTEC students are, on average, 36 years old. Candidates who complete the program are certified to teach in Wisconsin. The program involves a one-month induction phase in which participants receive basic training in curriculum and instruction. Then they do a two-semester paid apprenticeship in which they work as the teacher of record in an MPS classroom and attend weekly seminars. To be accepted into the MTEC program, applicants must hold a bachelor’s degree; they must have successful experience working with children in an urban environment; and they must have obtained passing scores on the Praxis I, a basic skills test that all Wisconsin teacher candidates take. MTEC has a remarkable retention rate. Most MTEC teachers remain in MPS. While it is difficult to calculate a completely accurate number, in 2005-2006, MTEC had an 85 percent retention rate for its teachers; in 2006-2007, it retained 87 percent. Some of this may be the result of financial incentives—loan forgiveness, for example—that are offered to MTEC teachers on condition that they remain in MPS for a period of time.
Table 2 shows the median Elementary Content Test scores for test-takers at Wisconsin’s private colleges and universities. The average median score is 167—above the national median score of 163 and well above the state-set qualifying score of 147. Test-takers at Maranatha Baptist Bible College perform the highest, with a median score of 181 (note, however, that the number of test-takers there is very low). Test-takers at Wisconsin Lutheran and Marian College show the next highest scores—with medians of 178 and 175.5, respectively. (Here again, the number of test-takers is low.) All these scores are well above the national average median score of 163. Test-takers from four schools perform below the national median: Alverno College, Edgewood College, Lakeland College, and St. Norbert College.
Table 3 shows median scores for individuals at UW-System schools who took the Middle School Content Knowledge Test. We include these results because some teacher training programs (including UW-Milwaukee) require prospective elementary school teachers to take this test instead of the Elementary Content Test because they are seeking a state certification up to Grade 8. The average median score here is 160, close to the national median score of 159 but well above the state-set qualifying score of 146. Test-takers at UW-Madison had the highest median score, at 170.5. Test-takers at UW-La Crosse and UW-Green Bay scored next-highest, with median scores of 167 and 165, respectively. These scores are well above the national median score of 159. UW-Milwaukee had many test-takers here (n=127), and they performed above the national median, with a median score of 161. Test-takers from three schools scored below the national median: UW-Superior and UW-Whitewater (at 157) and UW-Stout (well below the national median at 148).
Table 3 also shows that MTEC teachers score above the national median on the Middle School Content Knowledge Test, with a median of 163. Note that the number of MTEC test-takers here is 43—larger than reported in Table 1.
Table 4 shows median test scores on the Middle School Content Knowledge Test for test-takers at Wisconsin’s private colleges and universities. The average median score here is 163—above the national median score of 159 and well above the state-set qualifying score of 146. Test-takers at Ripon College and Viterbo College had the highest scores, with medians of 170 and 168, respectively (again, the number of test-takers for these two schools is low). Except in one case, these scores are well above the national average median of 159. Carthage College is the only private college whose test-takers scored (at 156) below the national median.
It is difficult to draw firm conclusions from the Praxis scores, given the uncertainty we have noted about the extent to which test-takers remained in the institutions’ teacher-candidate pools. Nevertheless, we think four observations stand out.
First, the minimum qualifying score set by the state is well below the national median score. It invites further attention. Several other states, including Delaware, Kentucky, Utah, and Vermont, have set their qualifying scores on the Praxis II Elementary Content Test somewhat higher than Wisconsin’s. The University of Utah uses the national median score as the passing score for content tests for which Utah has not yet established a qualifying score. Should the DPI consider raising its qualifying scores on the Praxis II series? Are such scores good predictors of the ability of teachers to increase student achievement gains? Follow-up studies of these and related questions might well suggest a need to boost Wisconsin’s minimum standard.
Second, most teacher candidates in Wisconsin score above the national median in their performance on the Praxis II content tests at the elementary and middle school level. This appears to be true whether the teachers in question are prepared at public or private institutions.
Third, test scores do vary by institution. Scores linked to some schools are well above the national median, while those linked to other schools fall below—sometimes well below. Test-takers at UW-Madison consistently show the state’s highest results. If, as research cited earlier suggests, academic ability is an important variable contributing to teachers’ ability to improve students’ academic achievement, then UW-Madison should be regarded as potentially an important provider of MPS teachers. Currently, few teachers from UW-Madison programs work in MPS.
Fourth, teachers prepared by the MTEC program do as well as or better than test-takers from traditional teacher training programs. This finding also invites further study. MTEC teachers are older, on average, than candidates from traditional teacher training programs, and they are further removed from traditional college studies. Why do they perform better on Praxis tests than teacher candidates who are currently enrolled in a bachelor’s degree program? Are their higher scores explained by the fact that MTEC requires a completed bachelor’s degree as an admission requirement? Are the lower scores for candidates from traditional programs explained by the allocation in those programs of two years of time or more to education courses rather than content courses?
Required credits in education courses
To learn about curricular emphases in training programs for elementary school teachers, we asked how many semester credits in education courses each of the programs requires.
The value of coursework in education is often questioned. Many critics contend that teacher quality would be enhanced if training programs included more coursework in the liberal arts and less in education. With this issue in mind, we repeatedly called and sent e-mails to all the schools and departments of education in the UW-System to learn the number of education credits required of an elementary major. We received responses from seven universities. Based on this limited sample, the lowest total (48 credits) is required at UW-La Crosse; the highest total (72 credits) is required at UW-River Falls. UW-Milwaukee requires 67 credits—the equivalent of about two years of full-time study. We estimate the average to be about 60 education credits. Compare this to the one-month induction phase, supplemented by weekly seminars during a paid apprenticeship, required of MTEC teachers.
Elementary education majors spend about half their time in college taking education courses. Given what we will see later about the value placed on general education courses by new MPS elementary teachers, and given the strong track record of MTEC teachers in MPS, this emphasis on education coursework seems excessive.
The link between training programs and teachers’ effectiveness
To learn about this link, we asked how new MPS teachers’ training programs correlate with their empirically observed capacity to improve their students’ achievement. This analysis—checking the value added by new teachers according to their teacher preparation program in one large urban district—may be unique in the nation.
We defined “new teacher” as one who had worked for up to six years, full-time, in MPS. Our sample was 160 new teachers. To conduct our analysis, we used a database developed in cooperation with MPS officials. Teacher and student data were provided by MPS’s Division of Assessment and Accountability. Initial files identified 4,857 (mathematics)/4,846 (reading) students (grades 3, 4, and 5) matched with new teachers. From the initial teacher-students file, 870 (mathematics)/865 (reading) students had pre-test and post-test scores linked with teachers whose institutional assignments were identifiable.
Our data show that MPS gets more new teachers (42 percent) from programs at UW-Milwaukee than from any other school. The next most prevalent Wisconsin institution in the sample is UW-Madison, at 8 percent.[i] Because the institutional counts yield very small sample sizes for many colleges and universities, we combined schools into three categories: Wisconsin institutions versus non-Wisconsin institutions; Wisconsin public institutions versus Wisconsin private institutions; and UW-Milwaukee versus other UW-system schools.
Table 5 displays descriptive statistics for these three groupings. The average years of experience for the novice teacher groupings varied from a low of 1.76 years for the UW-Milwaukee group to a high of 2.1 years in the non-Wisconsin teacher group. Sample sizes varied as expected, with most of the teachers coming from public Wisconsin institutions.
We did regression analyses using the pre- and post-test data on student performance gains. The question is whether a novice teacher’s training institution significantly affects gains in student performance. We controlled for student demographic variables typically used in such models: minority status, free and reduced lunch status, English Language Learner status, and special education assignment. Teachers’ years of experience and degree level (Bachelors or Masters) were also included in the regression model.
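A model of this general form can be sketched in miniature. The code below is purely illustrative and is not the actual MPS analysis: the data are synthetic, the variable names are invented, and ordinary least squares stands in for the estimation procedure. It shows how the coefficient on an institutional-grouping indicator estimates the achievement-gain difference once the listed controls are held constant.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400  # hypothetical student records; none of these fields are MPS's actual data

# Student demographic controls and teacher characteristics (all invented)
minority = rng.integers(0, 2, n)
free_lunch = rng.integers(0, 2, n)
ell = rng.integers(0, 2, n)            # English Language Learner status
special_ed = rng.integers(0, 2, n)
years_exp = rng.integers(1, 7, n)
masters = rng.integers(0, 2, n)
wisconsin_trained = rng.integers(0, 2, n)  # institutional-grouping indicator of interest
pretest = rng.normal(450, 30, n)

# Simulated post-test: the pre-test carries forward plus noise; by construction
# the training institution has no effect here
posttest = pretest + rng.normal(5, 10, n)

# Design matrix: intercept, pre-test, controls, and the grouping indicator
X = np.column_stack([
    np.ones(n), pretest, minority, free_lunch, ell,
    special_ed, years_exp, masters, wisconsin_trained,
])
coef, *_ = np.linalg.lstsq(X, posttest, rcond=None)

# coef[-1] estimates the gain difference attributable to the grouping,
# holding the controls constant; with no true effect it should be near zero
print(round(coef[-1], 3))
```

In the report’s analysis, the analogous coefficient, its standard error, and its significance level are what Table 6 reports for each institutional grouping.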
Table 6 displays the estimated differences between institutional groupings, along with standard error and significance levels.
Taking the Reading Analysis for Wisconsin Institutions Versus Non-Wisconsin Institutions as an example, Table 6 can be interpreted as follows: The estimated difference in student achievement growth between students with teachers from Wisconsin institutions and those with teachers from non-Wisconsin institutions is -0.258. That is, students with teachers from non-Wisconsin institutions are predicted to gain .258 scale score points more than students with teachers from Wisconsin institutions. However, this difference is nowhere near statistically significant, as shown by the very large significance level of the estimate (0.936). A significance level of this size means that a difference at least this large would be expected to arise by chance alone more than nine times out of ten even if the training institution had no effect at all; by convention, a difference is treated as statistically significant only when the significance level falls below 0.05. Other institutional groupings in both reading and mathematics can be interpreted similarly. For the purposes of this report, and as can be seen from Table 6, no institutional grouping comparison even approached a level of significance that would statistically suggest any difference in student achievement obtained by one institutional group versus another. Put plainly, the available MPS pre-test/post-test scores for third-, fourth-, and fifth-grade reading and math students suggest that student performance gains are not significantly affected by the school of education their teachers attended, after controlling for student demographics and the teachers’ highest degree and experience. The school of education a teacher attends does not appear to have any effect on the achievement of the students in the teacher’s classes.
Several caveats deserve mention. First, there was a sampling issue: our regression analyses were conducted on only one-fifth of the student sample, because many students’ scores had to be omitted when their teachers were not assigned in the database or did not have undergraduate institutions of record. Second, one potentially important variable not included in the analyses was the quality of teachers’ instruction; teacher evaluations, both by administrators and students, might more clearly identify differences among the teachers. Third, our regression models use a pre-test/post-test design, and because Wisconsin assesses students in November, a perfectly matched sample is not possible: a portion of each student’s measured gain is almost certainly attributable to the following year’s teacher, between September and November. There is currently no way to account for this effect in the model.
Nonetheless, even with these caveats in mind, performance gain differences—between students having teachers with degrees from Wisconsin or other states, from Wisconsin public institutions or private institutions, or from UW-Milwaukee or other UW-System schools—are not evident. The statistical evidence is quite clear: teachers trained in the same zip codes where MPS schools are located fare no better in raising student achievement than teachers trained in rural Wisconsin cornfields.
Our findings on this point are compatible with findings H. Gary Cook obtained in a similar study he carried out for MPS, comparing the performance of teachers trained in the MTEC program to other MPS teachers trained in traditional teacher preparation programs.[ii] Using a VAM analysis, Cook found no significant growth differences between MPS students taught by MTEC teachers and those taught by non-MTEC teachers in reading or mathematics at MPS elementary schools.[iii]
MPS teachers rate their training programs
To find out how new MPS teachers rate their teacher training programs, we conducted a survey in cooperation with the MPS Division of Assessment and Accountability.
We distributed a cover letter inviting all MPS elementary school teachers with up to six years of experience (1,576 teachers) to participate in an online survey about their training programs.[iv] One hundred ninety-one teachers, or approximately 12 percent of the eligible pool, provided complete survey responses for use in our analysis. The margin of error for our calculations is 6.65 percent at a 95 percent confidence level.
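As an arithmetic check on that figure: the reported 6.65 percent margin of error is consistent with the standard formula for a sample proportion with a finite-population correction. The sketch below is illustrative only; it assumes the conventional 95 percent confidence level (z = 1.96) and the most conservative proportion of 0.5.

```python
import math

def margin_of_error(n, N, z=1.96, p=0.5):
    """Margin of error for a sample proportion drawn from a finite pool.

    n: number of completed responses
    N: size of the eligible pool
    z: critical value (1.96 for a 95 percent confidence level)
    p: assumed proportion (0.5 is the most conservative choice)
    """
    standard_error = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((N - n) / (N - 1))  # finite-population correction
    return z * standard_error * fpc

# 191 complete responses from a pool of 1,576 eligible teachers
print(round(margin_of_error(191, 1576) * 100, 2))  # → 6.65
```

Without the finite-population correction the margin would be about 7.1 percent; the correction reflects the fact that the 191 respondents are a nontrivial fraction of the 1,576-teacher pool.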
Because the sample size was small for many of the individual teacher certification programs, we grouped the schools into five categories: UW-Milwaukee, representing 36 percent of the sample; all other (non-UW-Milwaukee) Wisconsin public colleges and universities, representing 15 percent of the sample; Wisconsin private colleges and universities, representing 24 percent of the sample; colleges and universities not in Wisconsin, representing 5 percent of the sample; and teachers trained at MTEC,[v] representing 21 percent of the sample.[vi] Even with these groupings, some samples remain small.
Table 7 displays some basic characteristics of the teacher cohorts. Interestingly, while the invitation letter was sent only to teachers identified by the MPS human resources department as having six years or less of MPS experience, a small number of teachers reported having more experience in the district than this. Perhaps some teachers included their student-teaching service or substitute service to MPS before beginning to teach full-time. Nonetheless, the data show that approximately 17 percent of our sample are in their first or second year in MPS; about 24 percent have three or four years of experience; and about 55 percent have five years of experience. These results vary somewhat by institutional group. UW-Milwaukee graduates constitute the smallest share of first- or second-year teachers in the sample, while 30 percent of the out-of-state graduates are first- or second-year MPS teachers. The second part of Table 7 shows this same distribution, but for total years of full-time teaching experience irrespective of district.
Table 8 shows responses to a question asking the teachers to state which components of their teacher training programs were most valuable to them. Overwhelmingly, the teachers selected clinical experiences (i.e., student teaching) as the most valuable component. They ranked their teaching methods courses (e.g., courses in how to teach math, reading, science, etc.) as second-most valuable. Academic courses (e.g., courses in history, mathematics, biology, etc.) ranked third. Coming in last—ranked least valuable—were general professional education courses (e.g., courses in the cultural and psychological foundations of education). In fact, only three teachers in the sample listed such courses as the most valuable component of their teacher certification programs.
Our results are generally congruent with results from other studies[vii] showing that clinical experiences are the program component that teacher candidates typically find most valuable. It is interesting to consider what the reasons might be for this outcome. One possibility is that professors of education make the student teaching experience valuable by virtue of the expertise they bring to bear on supervising student teachers in their placements and by providing ongoing, high-quality coaching in concurrent student teaching seminars. In some cases, that explanation is correct. More typically, it is not. At least in traditional teacher training programs, faculty members in education foundations courses do not supervise clinical experiences at all, and many faculty members in other departments (nominally responsible for clinical experiences) give them short shrift, avoiding supervisory responsibility altogether or engaging in it very lightly. Direct supervision of student teachers at many UW-System campuses—including UW-Milwaukee and UW-Madison—is turned over in large measure to graduate students, retired teachers, retired school administrators, or other ad hoc staff members. And these staff members, depending on their workloads, typically see student teachers five to six times per semester, for about an hour each time. As a matter of simple arithmetic, therefore, the beneficial effects of student teaching must derive very largely from the experience itself and from instruction provided day in and day out by the K-12 teachers who serve as cooperating teachers.
Actually, this practice makes a lot of sense. It implies tacit recognition of the fact that many tenure-track faculty members are uninterested in, and unqualified to provide help with, the details of K-12 classroom practice. What do you do when Jason has a tantrum in the back of the room? What do you do when the lesson you thought would engage the students for 45 minutes is over in 10 minutes and all hell breaks loose? And how often should little Rachel be permitted to go to the bathroom? A professor might have good intuitions or poor ones about such matters, but in either case the quality of the advice she might give would very likely not be grounded in her graduate school training or her subsequent scholarly activity. By default, she would address such problems much in the way that parents or flight attendants or park rangers address the day-to-day problems they face in dealing with difficult and unpredictable children and passengers and tourists. That isn’t necessarily a bad thing, but it does argue for turning the job over to people who do it all the time—i.e., to classroom teachers who are willing to help. For university faculty members, moreover, working intensively with student teachers is not necessarily a smart career move. Even though student teaching is regarded by nearly everyone as the key to training good teachers, faculty members are rarely rewarded for such work. Tenure, promotion, and salary increases typically go to those who write articles and books and get grant dollars for research. There is not much to be gained at UW-Madison or UW-Milwaukee by supervising student teachers. Better to leave such work to others.
While MPS elementary school teachers assigned high value to their student teaching experiences, they did not find much to value in their general professional education courses. Typically, these are courses in the history and philosophy of education and in cultural and psychological foundations of education. In an effort to understand the low rankings of these courses, we contacted all of the UW-System schools and departments of education to obtain syllabi for the cultural foundations or philosophy of education course that is typically required in their teacher training programs. After numerous requests by e-mail and telephone calls, we were able to come up with eight syllabi to examine. Given the low response rate, our observations are informal and anecdotal.
We found three things. First, these courses vary widely from campus to campus. In some academic disciplines, such as economics, there is widespread agreement on the content and organization of undergraduate and graduate courses. Principles of economics textbooks, for example, are marked by similar organizational patterns reflecting this consensus among economists. No such consensus exists among the faculty members who teach courses in the cultural, psychological, historical, or philosophical foundations of education. Some stress the historical background of American public education; others stress multiculturalism; others stress current issues in education, including self-actualization, suicide prevention, the dangers of bullying, the recognition of gang paraphernalia, eating disorders, the inadequacy of standardized tests, images of the teacher in popular television shows, and many others. Despite the variation, these courses are included in teacher training programs to address a common curricular requirement. The assignments are also all over the place. Some courses require no examinations or papers. Instead students keep portfolios, write reflection papers, do action research, participate in cooperative learning activities, write autobiographies, explain their personal philosophies, and so forth.
Second, key topics addressed in these courses—urban schooling, social change, culture, the sociology of knowledge, oppression, power, poverty, race, class, gender, and ethnicity—lend themselves readily to analysis from a left-leaning political perspective. Other analyses of the same topics would be possible, of course, but reading lists attached to the syllabi we could examine featured familiar names from the progressive movement (Rousseau, Dewey) and its latter-day followers (James Banks, Jonathan Kozol, and William Ayers). An undergraduate informed by these topics and these reading lists would not learn about the argument for school choice or state assessment programs or merit pay or the concept of human capital, even though those topics are of great importance outside schools of education. Nor would such a student encounter the arguments of education critics such as Milton Friedman, Thomas Sowell, Diane Ravitch, Chester Finn, and E.D. Hirsch. The concept of diversity, as understood in these courses, evidently does not include intellectual diversity.
Finally, Table 8 shows that MPS teachers also give academic courses low rankings. How to reconcile this finding with advocacy among many education reformers for more liberal arts courses for education majors? Earlier we cited research by Goldhaber and Brewer in which results showed a significant positive relationship between teachers’ degrees and their students’ achievement in some subjects.[viii]
One possible explanation of the apparent disjuncture of views here has to do with the teaching of reading. The teaching of reading looms very, very large among the instructional tasks of elementary school teachers. For some primary-grades teachers, it is nearly all that matters. And it is not clear how college and university liberal arts courses—in chemistry or history or German, say—could inform the teaching of early reading in any direct manner. Still, that point does not explain why college and university courses in math and composition and history and political science would not be viewed as important by elementary school teachers who also are expected to teach math and composition and history and government. Something is clearly wrong with a system in which the content teachers are expected to teach is not much valued by the teachers. One possible implication is that faculty members in college and university liberal arts departments present their courses poorly, failing to capture the interest of students who will go on to become teachers, and thus share the blame for a strain of anti-intellectualism in the schools. Alternatively, it may be the case that composition, history, government, and similar areas of study undergo such an extreme transformation in the elementary school curriculum that their disciplinary sources, as represented in college and university courses, strike classroom teachers as merely quaint and frivolous. Either way, it is not a good thing.
Table 9 shows responses of new MPS elementary school teachers to a question asking them to rate the general preparation for teaching they received at their teacher education programs. Teachers were first asked to respond to the statement, “I believe that my teacher certification program provided me with sufficient preparation overall for my first teaching assignment.” Over 72 percent of the teachers in the sample either agreed or strongly agreed with this statement, while just under 20 percent disagreed or strongly disagreed. Overall, this appears to be good news. The non-Wisconsin institutions fared the best on this statement.
About 73 percent of MTEC teachers agreed or strongly agreed that they were sufficiently prepared in general for the classroom. The results were far less positive for UW-Milwaukee, where about 60 percent agreed or strongly agreed that they were sufficiently prepared in general for the classroom. Nearly 30 percent of the UW-Milwaukee teachers disagreed or strongly disagreed with this statement.
Table 9 shows the MPS teachers’ responses to the statement, “I believe that my teacher certification program provided me with sufficient teaching skills to be successful in the classroom.” Over 70 percent of the teachers in the sample either agreed or strongly agreed with this statement, while just under 20 percent disagreed or strongly disagreed.
MTEC teachers gave positive ratings of their preparation in teaching skills, with about 75 percent agreeing or strongly agreeing with the prompt—higher than average. The results were less positive for teachers from UW-Milwaukee. Fifty-nine percent of them agreed or strongly agreed with the prompt. UW-Milwaukee also had the highest percentage of teachers who disagreed or strongly disagreed (nearly 30 percent) with the prompt.
Table 9 also shows the MPS teachers’ responses to the statement, “I believe that my teacher certification program provided me with sufficient content knowledge to be successful in the classroom.” Over three-fourths of the teachers (about 77 percent) in the sample either agreed or strongly agreed with this statement, while about 16 percent disagreed or strongly disagreed. Again, not all the programs fared the same. Other Wisconsin public colleges and universities actually achieved a 100 percent rating. MTEC teachers, by contrast, rated their preparation in content knowledge at 60 percent—far below the overall percentage. UW-Milwaukee-trained teachers rated their program much higher: 78 percent of them agreed or strongly agreed with the prompt.
Table 10 shows responses of new MPS elementary school teachers to a set of statements about their preparation to teach in MPS. The first statement asked the teachers to rate how well prepared they were to deal successfully with the diversity of students in their classrooms. On this question, 67 percent of the respondents agreed or strongly agreed with the prompt. Over 23 percent of the teachers disagreed or strongly disagreed with it. Again, this seems to be a positive rating, although not as high as the ratings given in response to previous questions on general preparation for teaching. The responses here were relatively evenly distributed. Perhaps the only surprising finding is that nearly 30 percent of UW-Milwaukee teachers disagreed or strongly disagreed with the prompt. By comparison, teachers from other public colleges and universities gave higher rankings regarding their preparation for dealing with diversity; 71 percent of them agreed or strongly agreed with the prompt, and only 21.5 percent disagreed with it. Given UW-Milwaukee’s stated mission of preparing teachers to work in a diverse environment, one might have expected these ratings to have been reversed.
Table 10 shows MPS teachers’ responses to the statement, “I believe that my teacher certification program provided me with sufficient classroom management skills to work in an urban school environment.” Here, only 47 percent of the teachers agreed or strongly agreed with the prompt, and almost one quarter disagreed or strongly disagreed. This is the second-lowest ranking given to any of the prompts in our survey. It reflects a long-standing criticism of traditional teacher training programs—i.e., that they do a poor job of training teachers to establish orderly classroom environments and to handle discipline problems effectively. Only 35 percent of the teachers trained at UW-Milwaukee—where the explicitly stated mission is to prepare for work in MPS—agreed or strongly agreed with the prompt, and half disagreed or strongly disagreed. The sole exception to this negative pattern of results came from teachers trained in the MTEC program. Over 70 percent of the MTEC teachers agreed or strongly agreed with the statement, while about 23 percent disagreed.
Table 10 also shows MPS teachers’ responses to the statement, “I believe that I was well prepared by my teacher certification program for working in a large urban school system like the Milwaukee Public Schools.” Here, 54 percent of the teachers agreed or strongly agreed, and one third disagreed or strongly disagreed. As with the previous prompt regarding classroom management, this is a relatively low rating. Only 51 percent of the teachers trained at UW-Milwaukee agreed or strongly agreed with the prompt, and nearly 40 percent disagreed or strongly disagreed. Again, this is not a positive result, given UW-Milwaukee’s urban mission. The only consolation UW-Milwaukee might take here is that other Wisconsin state schools did far worse. Only about one-third of other UW-System teachers agreed or strongly agreed that they were well prepared to teach in the MPS, and over 40 percent disagreed or strongly disagreed. The sole exception, again, was teachers trained through the MTEC program. Over 70 percent of the MTEC teachers agreed or strongly agreed with the statement that they were well prepared to teach in MPS, and nearly 23 percent disagreed.
Table 11 shows responses of new MPS elementary school teachers to a question about their attitudes toward student achievement. The first prompt is “I believe that teaching to increase academic achievement is the central job of a classroom teacher.” On this statement nearly 90 percent of the teachers agreed or strongly agreed, while less than 10 percent disagreed or strongly disagreed. This is as close to a consensus result as we have seen.
The second prompt in Table 11 is “I believe that my teacher certification program did a good job of preparing me to increase the academic achievement of my students.” Here the responses are not as positive. While nearly 90 percent of the MPS teachers felt that increasing academic achievement was central to the job, only 66 percent of them agreed or strongly agreed that they were well prepared to raise achievement, and about 19 percent disagreed or strongly disagreed. Only 52 percent of the MTEC teachers agreed or strongly agreed with the prompt, compared to 60 percent of the teachers trained at UW-Milwaukee.
Table 12 shows responses from new MPS elementary school teachers to prompts regarding other aspects of their training programs. The first prompt is “The knowledge I learned about how to be an effective teacher from the instructors in my teacher preparation program was similar to what I found I needed to be successful as a classroom teacher.” This statement was intended to measure the congruence between what teachers learned in their certification programs and what they encountered in MPS. Sixty-six percent of the respondents agreed or strongly agreed with the statement, while almost 21 percent disagreed or strongly disagreed. This seems like a positive result, given the long history of complaint among teachers about the gap between theory and practice in teacher training. Teachers trained at other UW-System schools were the most positive. Three fourths of them agreed or strongly agreed with the statement. MTEC teachers gave the next highest ranking; 70 percent of them agreed or strongly agreed. Ratings for teachers trained at UW-Milwaukee were considerably lower; only 57 percent of them agreed or strongly agreed with the prompt.
The second prompt in Table 12 is “I believe that teaching for social justice is the central job of a classroom teacher.” Overall, 45 percent of the teachers agreed or strongly agreed with this prompt, while almost 25 percent disagreed. Teachers from UW-Milwaukee agreed at a rate of 50 percent, while positive ratings were lower for teachers from other schools. It appears that most new MPS teachers do not see teaching for social justice as their main responsibility. The relatively low overall rating on this prompt is perhaps predictable, given that the vast majority of new MPS teachers (over 88 percent) stated in response to an earlier prompt that they viewed increasing academic achievement as a central part of their job.
The third prompt in Table 12 is “I believe that my teacher certification program was academically rigorous and challenging.” Over 85 percent of the teachers trained at other UW-System institutions agreed or strongly agreed with this statement. About 70 percent of UW-Milwaukee teachers agreed or strongly agreed; 60 percent of the MTEC teachers agreed or strongly agreed.
Finally, Table 12 shows the teachers’ responses to the statement, “I would recommend my teacher certification program to others.” Overall, almost 73 percent of new MPS elementary teachers trained in all the programs agreed or strongly agreed with this statement; only about 10 percent disagreed. The strongest positive ranking came from teachers trained in other UW-System schools; 93 percent of them agreed or strongly agreed with the prompt. Seventy percent of the MTEC teachers agreed or strongly agreed. The lowest positive ranking (63 percent) came from teachers trained at UW-Milwaukee.
Here is a summary of the survey results.
Four important conclusions can be drawn from this study. First, Wisconsin provides no easy way for school districts to learn about the probable quality of certified teachers in advance of hiring.
Second, while most teacher candidates in Wisconsin score above the national median on the Praxis II exam, MPS is more likely to attract teachers from schools where candidates perform, on average, at the national median. That is because few new teachers from UW-Madison get hired to teach at elementary schools in MPS. MPS is much more likely to hire teachers from UW-Milwaukee and the MTEC program.
Third, the evidence shows that new MPS elementary school teachers are improving the academic achievement of their students, but the student performance gains are not significantly related to the schools or departments of education the teachers attended. Elementary school teachers trained at institutions that specialize in an urban mission—such as UW-Milwaukee and MTEC—perform no better in MPS than teachers trained elsewhere, in terms of the achievement gains they produce.
Fourth, the MTEC alternative teacher preparation program has two clear advantages over traditional programs. First, it is more efficient. Elementary school teachers trained at traditional schools and departments of education take on average about 66 credits of education courses—about half of all their university coursework. These are the courses that teachers surveyed for this study ranked as the least valuable parts of their training programs, by far. In contrast, the MTEC program requires about a month of classroom training and a great deal of on-the-job teaching. Moreover, many MTEC teachers rate the quality of their program more highly than teachers from other programs rate theirs. Yet MTEC teachers are not more effective. They produce the same achievement gains as teachers trained in traditional programs at UW-Milwaukee and other UW-System schools. MTEC has struggled to figure out what it would take to produce achievement gains superior to those of teachers trained in traditional programs. Second, MTEC has a high year-to-year retention rate—over 80 percent in recent years. Milwaukee has a high turnover rate among its teachers—much higher than the rate in other Wisconsin school districts. The fact that MTEC teachers tend to stay with MPS is no small accomplishment.
Our study overall suggests many possible areas for improvement. For example, we might recommend that:
But we choose to refrain from emphasizing a laundry list of specific recommendations for fine tuning Wisconsin’s teacher preparation programs as they relate to MPS. Why? Partly because recommendations of this sort are old news. Similar recommendations have been proposed, and ignored, for a long time. Also, traditional teacher training programs today are restricted in ways that are almost too numerous to count. In other words, it is hard for us to imagine that the system can change itself by adopting measures aimed at fine tuning. PI 34—the state’s rules governing teacher training—ties the enterprise up and buttresses the status quo. Almost always, when leaders at traditional teacher training programs are challenged to defend their programs in response to student or parental complaints, they take refuge in the rules, pointing out that they are just following the mandates of PI 34.
It gets worse. Tenure protection for senior faculty members makes it difficult to redesign or abolish certain courses in the teacher preparation program, no matter what program graduates think of them. Those courses belong to somebody, and he or she doesn’t want to lose them. Similarly, college and university chancellors and presidents would be hard pressed to imagine why they should increase admission standards for schools and departments of education when such actions would almost certainly result in reduced enrollments and less revenue for the institution. And throughout the UW-System all the issues surrounding “faculty governance” add to the bureaucratic inflexibility which makes it almost impossible for schools and departments of education to embark on truly innovative approaches.
Our primary recommendation is not for fine tuning. It is that the State of Wisconsin should push its teacher training schools to change the subject—that is, to focus effort and resources sharply on the task of teaching new teachers how to improve the academic achievement of their students, in urban schools and elsewhere. Suitable programs would share many of the features that now exist in charter schools. Toward this end, exemptions from DPI rules should be issued as necessary. A state board—perhaps one appointed by the Board of Regents—could accept applications from interested institutions. New programs would be allowed to experiment. They would be supported in their efforts to attract bright, capable people from any background to serve as leaders. They would set new standards for teachers, admitting only candidates who stood out as smart, well-educated, and hard-working. They would feature intense internships (taking a cue from MTEC) rather than traditional models of student teaching. Their programs would focus on the curriculum and standards for which new teachers would actually be responsible when they begin to teach. And they would strive in all these efforts—and others of their devising—to validate their practices by reference to empirical evidence about the known effects of those practices on students’ academic achievement. National partners such as Teach for America—a program that attracts thousands of college graduates, many from UW-Madison,[ix] to teach in urban schools—might be an excellent model.
We also offer two other recommendations.
[i] It should be noted that MTEC does not appear as a separate unit in the frequency counting here because the database uses each teacher’s baccalaureate institution for his or her school of education, and MTEC-trained teachers have earned bachelor’s degrees at other schools.
[ii] Cook, H. Gary. Evaluation of the Milwaukee Teacher Education Center (MTEC) at Elementary Grades. Research Report #0503 (nd). Milwaukee Public Schools Division of Assessment & Accountability.
[iii] Statistical models estimated on our data set also identified no differences in student achievement between students of MTEC and non-MTEC trained teachers.
[iv] The cover letter and survey instrument can be viewed in the appendix of this report.
[vi] Due to rounding errors, these numbers do not add up to 100 percent.
[vii] Levine, Arthur. Educating School Teachers: The Education Schools Project, 2006. http://www.edschools.org/pdf/Educating_Teachers_Report.pdf accessed December 2007
[viii] Goldhaber, D.D., and D.J. Brewer. (1996) Evaluating the Effect of Teacher Degree Level on Educational Performance. Developments in School Finance, pp. 199-210.
[ix] Borsuk, Alan. Teach for America Considers Milwaukee. Milwaukee Journal Sentinel (December 25, 2007), p. 3B.
©2007 Wisconsin Policy Research Institute, Inc. P.O. Box 487 Thiensville, WI 53092