College Media

Bogus "Best College Newspaper" list out; in other news, sun rises in east, water is wet (Rant)

Others in the mediasphere are commenting on the Princeton Review’s annual “Best College Newspapers” list. The list of the “top” newspapers contains many fine examples of student publications. If you want to see the list, you can register with the Princeton Review (which I refuse to do on principle) or you can check out the list at College Media Matters.

Below is a slightly revised version of what I wrote about this list in 2010. There was a change to the survey question, but it’s still a horrible question.

In summary, the list is complete and utter bull.

How do I know? I mean, it’s the Princeton Review, right? It has “Princeton” in the name, so there must be something there. They must have some pretty impressive methodology to determine the 20 “Best” college newspapers in a country with around 2,000 such newspapers, right? They must have a huge matrix of quantitative and qualitative measures and operational definitions of “best” to come up with this list.

Sadly, no.

This is the methodology for naming the “top 20 college newspapers”:

The survey has more than 80 questions in four main sections: “About Yourself,” “Your School’s Academics/Administration,” “Students,” and “Life at Your School.” We ask about all sorts of things, from “How many out-of-class hours do you spend studying each day?” to “How do you rate your campus food?” Most questions offer an answer choice on a five-point scale: students fill in one of five boxes on a grid with headers varying by topic (e.g. a range from “Excellent” to “Awful”). All of our 62 ranking lists tallies are based on students’ answers to one or more of these questions with a five-point answer scale. The five-point grid – which is called a Likert scale – is the most commonly used measurement for this type of survey research: consensus-based assessment. Statisticians consider it most accurate as it presents equal amounts of positive or negative positions.

I love how their methodology section uses “Likert Scale” and “Statisticians consider …” as an appeal to authority. Honestly, their rankings (not the ratings) have all the scientific validity of an online poll.

The question they asked students about the newspaper on their college campus: “How do you rate your campus newspaper?” This is different from the 2010 question: “How popular is the newspaper?”

Now, I think it’s a fine idea that someone survey students about how they rate the student newspaper on campus. But it’s not a valid method to establish which is “the best” student newspaper in a country with somewhere in the neighborhood of 2,000 student media outlets.

How on earth do you rank college newspapers based on the opinions of people who have no interaction with other college newspapers? I mean, do most University of Texas students read the Daily Collegian at Penn State?

How do you even rank anything in a survey without using an ordinal survey question?

The answer is: You don’t. Unless you’re peddling some kind of b.s. college ranking book for $23.99.
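To make that concrete, here’s a toy simulation. This is purely my own illustrative sketch, nothing resembling the Princeton Review’s actual data; the quality scores, rater bias, and sample sizes are all invented. Students on two campuses each rate only their own paper on a five-point Likert grid, and “ranking” the papers by mean rating rewards whichever campus has the more generous raters, not the better paper:

```python
import random

random.seed(42)

def simulate_ratings(true_quality, rater_bias, n=500):
    """Each student rates ONLY their own paper: true quality plus a
    campus-wide rater bias plus noise, clamped to the 1-5 Likert grid.
    (All numbers here are made up for illustration.)"""
    ratings = []
    for _ in range(n):
        raw = true_quality + rater_bias + random.gauss(0, 1)
        ratings.append(min(5, max(1, round(raw))))
    return ratings

# Campus A has the better paper; Campus B's students just rate everything
# more generously (+1.5 on the same scale).
campuses = {
    "Campus A (true quality 4.0)": simulate_ratings(4.0, rater_bias=0.0),
    "Campus B (true quality 3.0)": simulate_ratings(3.0, rater_bias=1.5),
}

# "Rank" the papers by mean self-rating, the way a one-question survey would.
ranked = sorted(campuses.items(),
                key=lambda kv: sum(kv[1]) / len(kv[1]),
                reverse=True)

for rank, (name, ratings) in enumerate(ranked, start=1):
    print(f"#{rank}: {name} -> mean rating {sum(ratings) / len(ratings):.2f}")

# Campus B's weaker paper comes out "#1" because its raters are more
# generous: the ranking measures the raters, not the papers.
```

Since no student rates both papers, there is no common yardstick; the comparison is between two different groups of raters, which is exactly the problem with ranking campuses against each other on a self-rating question.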

And this is NOT a knock on any of the papers on the list. I’m pretty sure they are high-quality journalistic outlets. But I haven’t done any research to find that out. And neither has the Princeton Review. That’s the point.

Look, I know people like lists. I get asked frequently to give a list of student news outlets that are doing innovative things online. I can never rank them. Why? Because the most innovative student online outlets all excel at different things. And who am I to quantify which ones are best? Obviously, I’m not the Princeton Review. Maybe I should just ask some random students off the street how popular the student news web site on their campus is and rank them that way.

I haven’t delved into the other rankings in the Princeton Review’s survey, although the “Best College Radio Station” question is like unto the newspaper one: “How popular is the radio station?” (How many students surveyed know how to read an Arbitron ratings book?) But if those two are indicative of the kind of high-quality survey methodology the Princeton Review is basing its “Best” list on, I’m guessing the entire rest of the book is filled with fertilizer as well.

And, yes, I realize they’ve been publishing this since 1992.

Anyone who bases a decision about which college to attend on this type of pseudo-scientific stuff should probably consider whether going to college is in their best interests to begin with.
