Saturday, November 15, 2014

Ranking schools is like ranking foods.

How good is your school?

It's a seemingly simple question. Why is it so difficult to answer? Perhaps we need to start with what makes a school good. A safe environment? A diverse population? A homogeneous population? Academic rigor? Social opportunities? Community involvement? Any answer we give will reflect our own prejudices or our own personal needs. Perhaps there are multiple answers, so that's okay. But next we must ask what evidence would show us that our preferred qualities exist in a particular school. People move to areas for their schools, so what do they look for? I have spent all my life in schools, either as a student or as a teacher, and when my brother was considering where to build a house and we talked about this very thing, I was stumped. Thinking back, we could only say we "knew" which schools were good and which were bad. That meant, of course, our judgment was based on prejudices neither of us wanted to acknowledge. The one agreement we reached was that the higher the median income of an area, the more likely it was to have good public schools.

Schools need to sell themselves. Private schools obviously need to attract new families/customers, but public schools need to motivate families to move to the area and teachers to seek jobs there. Self-promotion needs more than gut feelings, so we go for numbers: graduation rates, disciplinary referral counts, student-teacher ratio, per-student spending, and of course test scores. All those numbers can be and are manipulated. None of them necessarily means a school is good or bad, but we yearn for some way to make our decisions, so what do we do? Can we ever definitively say one school is better than another?

Think of how we choose food. What makes food good? Is there a number we can use? Our food labels are certainly full of numbers. We periodically focus on the Calories, grams of fat, grams of carbs, RDI of a particular vitamin, etc. None of these definitively make one food better than another. That doesn't mean there's no such thing as good or bad food. I'm sure we can come to some agreement about some foods in either category, but ranking foods would make no sense except as a fun debate among friends. 

Yet schools are consistently ranked as though we know exactly how to do it. Universities have been ranked for years by US News & World Report, as well as by others, and the findings are accepted without much question as to how they're decided. The amount of money a university has and is making, along with personal-opinion surveys, figure heavily in the formula. Did you know that? Would you put those high on the list if you designed the ranking? But the list is not called "the richest and best-known universities." It's called America's Top Universities. The ranking influences many people's college decisions.

22.5% reputation survey--how well they're regarded by college presidents 
22.5% retention--6-year graduation rate, mostly
20% faculty resources--class size and faculty salaries
12.5% student selectivity--admission rate
10% financial resources--how much money the school has
7.5% graduation rate performance--how well the rate compares with US News's prediction
5% alumni giving--how much money comes in via donations
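Mechanically, a ranking like this is just a weighted sum. Here is a rough sketch of how the arithmetic works, using the weights quoted above; the two schools and their category scores are entirely invented for illustration, not real data:

```python
# Category weights as quoted above (they sum to 1.0).
weights = {
    "reputation": 0.225,
    "retention": 0.225,
    "faculty_resources": 0.20,
    "student_selectivity": 0.125,
    "financial_resources": 0.10,
    "grad_rate_performance": 0.075,
    "alumni_giving": 0.05,
}

def composite(scores):
    """Weighted sum of normalized category scores (0-100 scale)."""
    return sum(weights[k] * scores[k] for k in weights)

# Two made-up schools: one rich and famous, one with better
# graduation outcomes but less money and prestige.
school_a = {"reputation": 95, "retention": 90, "faculty_resources": 85,
            "student_selectivity": 95, "financial_resources": 99,
            "grad_rate_performance": 70, "alumni_giving": 80}
school_b = {"reputation": 70, "retention": 92, "faculty_resources": 75,
            "student_selectivity": 60, "financial_resources": 50,
            "grad_rate_performance": 95, "alumni_giving": 40}

print(composite(school_a))  # the rich, famous school wins
print(composite(school_b))
```

Notice that nearly a third of the total (financial resources plus alumni giving plus the reputation survey's prestige component) rewards wealth and fame directly, so the invented "school_b" can out-graduate "school_a" and still lose.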

Are those the categories you would choose, and the weights you would give, if you made up a list from scratch? Would you look at post-graduation employment rate? Price? Population diversity? It doesn't matter. This is the ranking, and to save you some time, Harvard, Yale, and Princeton are always the top three. If the same three teams always won the Super Bowl, would you consider it a fair system? My brother and I took different career paths. He became an engineer. I became a high school teacher. We needed different strengths in our universities, so the idea of there being a best school even for the two boys in my family seems odd.

Now Newsweek annually prints a list of the 500 best high schools in the country. If they published a definitive rank of the 500 best restaurants in the nation, would you take it seriously? They have tweaked the formula every year, especially because of the criticism it has received for being heavily populated by schools that either serve high-income areas or selectively screen out disadvantaged students. Their explanation of their methodology admits they must "address critiques of past rankings that claim that school performance as measured by average student test scores is as much or more a function of student background characteristics than of factors within a school’s control." That's the fine print, but they don't call the list "schools with the highest state test scores." That would be honest, but not eye-catching. The Newsweek list was originally based on a formula called the challenge index, created by a Washington Post writer. Check it out. It's an interesting number, but that's all. His list is not called "the 1000 most prolific takers of AP tests." That, too, would be honest, but not eye-catching. Calling these lists the best and most challenging high schools is a ridiculous claim, but it influences policy decisions.

The UK has a brilliant solution to the problem of school rating, and also completely ignores it most of the time. 

The brilliant part involves inspection reports. Ofsted (for state schools) and ISI (for independent schools) periodically send a group of inspectors to observe every aspect of a school for a week. They observe classes, interview students and teachers, and so on, to get as complete a picture as possible, then publish a full report for all to see. The reports are quite thorough, and I've used them when researching schools myself. Here is last year's report on my current school. The inspectors are often officials from other schools, so they even get to learn from one another as they go. The process is stressful, and of course the inspectors are observing a school that knows it's being observed, but in my opinion the system works very well. It makes no attempt to rank schools, but each report explains the logic of its rating in each category extensively.

Of course, inspection reports too often take a back seat to test scores. Students are judged by them. Teachers are judged by them. Schools are judged, and ranked, by them. The national system is judged by them, and they make the national news in a way I've never seen inspection reports do. Are test scores an accurate measure of learning, teaching, or school management? They're certainly treated as though they are.

Using inspection reports is like using food labels and taste tests to help decide what food is best for you. Using test scores is like basing all food decisions on the number of calories and nothing else. You could even rank foods that way, if you wanted to. If it seems crazy to rank foods definitively by one associated number, why would it make sense to do that with schools, which are just as complex?
