Murphy, Evaluation of a Website: RateMyProfessors.com

The following student essay includes all the elements of an evaluation argument. The student who wrote the essay was evaluating a popular website, RateMyProfessors.com.

EVALUATION OF A WEBSITE: RATEMYPROFESSORS.COM

KEVIN MURPHY

1

Thesis statement

Since 1999, both students and professors have been writing, reading, defending, and criticizing the content on RateMyProfessors.com (RMP). With over 15 million student-written reviews and over 4 million visitors a month, RMP continues to be the most popular site of its kind (“About RateMyProfessors.com”). However, the fact that a website is popular does not mean that it is reliable. Certainly RMP may be interesting and entertaining (and even, as New York Times writer Virginia Heffernan recently wrote, “engrossing”), but is it useful? Will it help students to make informed decisions about the schools they choose to attend and the classes they choose to take? Are the ratings—as well as the site itself—trustworthy? Is the information about professors and schools comprehensive enough to be meaningful? No student wants to waste time in a course that is poorly taught by a teacher who lacks enthusiasm, knowledge, or objectivity. However, an evaluation of the reviews on RateMyProfessors.com suggests that the site is not trustworthy or comprehensive enough to help college students make the right choices about the courses they take.

2

Evidence: First point in support of thesis

The first question to ask about the reviews on RMP is, “Who is writing them?” All reviews on the site are anonymous, and although anonymity protects the writers’ privacy and may encourage them to offer honest feedback, it is also a red flag. There is no guarantee that the reviews are written by students. In fact, anyone—even the professors themselves—can create RMP accounts and post reviews, and there is no way of knowing who is writing or what a writer’s motivations and biases are. In addition, the percentage of students who actually write reviews is small. According to one recent survey, only 8 percent of students have ever written a review for an online professor-rating site; in other words, “a vocal minority” is running the show (Arden). Furthermore, the ratings for each individual professor vary greatly in number, quality, and currency. Even in the rare cases where a professor has hundreds of recent ratings, the score may represent the views of only a small percentage of that professor’s students. This means that getting a representative sample is highly unlikely. Unless the website’s managers institute rules and restrictions to ensure the legitimacy of the writer and the size of the sample, the RMP ratings will continue to be untrustworthy.

3

Evidence: Second point in support of thesis

The second question to ask is, “Who controls RMP’s content?” Although RMP posts “Site Guidelines” with a “Do” list and a “Do Not” list, these lists are merely suggestions. The RMP Site Moderation Team will remove obscene or unlawful posts, but it has no way to enforce other guidelines. For instance, one of the items on the “Do Not” list asks users not to “post a rating if you have not taken a class with the professor” (“Site Guidelines”). However, to sign up for an RMP account, a user does not have to identify his or her university or list the courses he or she has taken. The site asks only for a name, a birth date, and the right to share the user’s personal information with its partner companies. This last question is a reminder that RMP is ultimately a commercial venture. The site is not owned by students or by their universities; it is owned by mtvU, a TV network that in turn is owned by media giant Viacom. The fact that each page of RMP content is surrounded on three sides by advertisements reminds users that the primary purpose of this site is to make money. When that fact is combined with the fact that the company has “the right to review, monitor, edit and/or screen any content you post,” it indicates that RMP does not warrant students’ trust (“Terms of Use”). A for-profit corporation, not the student reviewers, controls all of the information on the site and may modify content to increase traffic and impress advertisers.

4

Evidence: Third point in support of thesis

The last question to ask is, “Does RMP offer students the right kind of information—and enough in-depth information to give them a comprehensive understanding of a professor’s effectiveness as a teacher?” In fact, the site offers ratings in only four categories: “Helpfulness,” “Clarity,” “Easiness,” and “Hotness.” As one highly rated professor points out, “None of the dimensions [of RMP’s rating system] directly addresses how much students felt they learned” (qtd. in Arden). Moreover, no category addresses the professor’s knowledge of the subject matter. The ratings tend to focus attention on superficial qualities rather than on substance, apparently assuming that most students are looking for “easy A” classes taught by attractive, pleasant instructors. For students who are trying to make informed decisions about which classes to take, these criteria are inadequate. As one frustrated student user explains, “One of my professors had a really negative rating and comments, but he came to be one of my favorites…. his way of teaching matched me perfectly” (qtd. in Ross). The focus of RateMyProfessors.com is not on giving substantial feedback about teaching effectiveness or information about the educational value of a class. Perhaps these kinds of feedback do not attract advertisers; feedback about a professor’s “Hotness”—the least important measure of effectiveness—apparently does.

5

Refutation of opposing arguments

Students who argue that RMP is a “useful resource” say that the site helps them decide which professors to take and which to avoid (Davis). For example, one community college student says that checking professors’ scores on RMP “helps me choose a professor who will suit my needs” (qtd. in Davis). Committed RMP users also say that they are able to sift through the superficial comments and find useful information about professors’ teaching styles. As one junior at Baruch College in New York City says, “It’s all about perspective, and you need to be aware of this when you use the site” (qtd. in Ross). Users claim that they can read reviews and understand that “the same course materials may work really well with one group of students and less well with another” (McGrath) and that “even though [a particular student] doesn’t seem to like the professor, it sounds like I might” (qtd. in Davis). This ability to read between the lines, however, does not change the fact that the information on RMP is neither verifiable nor comprehensive. RMP’s reviews are anonymous, and some of them are almost certainly not written by students who have taken the professors’ classes. Professors’ “Overall Quality” scores, which so many students rely on, are based on ratings by these untrustworthy reviewers. Furthermore, these “overall” ratings are based on only two factors: “Helpfulness” and “Clarity” (“Rating Categories”). A rating that is calculated on the basis of very limited information from questionable sources can hardly be a “useful resource.” On balance, then, RMP does not give students the information they need to make informed decisions.

6

Concluding statement

On RMP’s homepage, the site managers encourage visitors to “join the fun!” (“About RateMyProfessors.com”). “Fun” is ultimately all users can hope to find at RMP. As Virginia Heffernan recommends, “Read it like a novel, watch it like MTV, study it like sociology. Just don’t base any real decisions on it.” Real students’ honest and thorough reviews of professors are invaluable, but sites like RMP do not provide this kind of helpful feedback. When deciding between a commercial website and old-fashioned word of mouth, anyone who thinks that RMP offers more useful information should keep in mind who writes and controls the site’s content. Because visitors to the site know almost nothing about the reviewers, they cannot know if their comments and ratings are trustworthy. Moreover, because they do know something about the site’s owners, they should know enough to be wary of their motives. If students are looking for useful advice about which classes to take, they should look no further than their own campuses.

Works Cited

“About RateMyProfessors.com.” Rate My Professors, MTV Networks, 2011, ratemyprofessors.com.

Arden, Patrick. “Rate My Professors Has Some Academics Up in Arms.” Village Voice, 26 Oct. 2011, www.villagevoice.com/arts/rate-my-professors-has-some-academics-up-in-arms-7165156.

Davis, Mandi. “Rate My Professor Gains Popularity with MCCC Students.” Agora, Monroe County Community College, 7 Dec. 2011, www.mcccagora.com/news/view.php/509769/Rate-My-Professor-gains-popularity-with-.

Heffernan, Virginia. “The Prof Stuff.” The New York Times, 11 Mar. 2010, www.nytimes.com/2010/03/14/magazine/14FOB-medium-t.html?_r=0.

McGrath, James F. “When My Son Discovered RateMyProfessors.com.” Inside Higher Ed, 15 June 2015, www.insidehighered.com/views/2015/06/15/essay-about-professor-who-learns-his-son-has-discovered-ratemyprofessorscom.

“Rating Categories.” Rate My Professors, MTV Networks, 2011, ratemyprofessors.com.

Ross, Terrance. “Professor Evaluation Website Receives Mixed Reviews.” Ticker, Baruch College, City U of New York, 12 Sept. 2011, ticker.baruchconnect.com/article/professor-evaluation-website-receives-mixed-reviews/.

“Site Guidelines.” Rate My Professors, MTV Networks, 20 June 2011, ratemyprofessors.com.

“Terms of Use.” Rate My Professors, MTV Networks, 20 June 2011, ratemyprofessors.com.

GRAMMAR IN CONTEXT

Comparatives and Superlatives

When you write an evaluation argument, you make judgments, and these judgments often call for comparative analysis—for example, arguing that one thing is better than another or the best of its kind.

When you compare two items or qualities, you use a comparative form: bigger, better, more interesting, less realistic. When you compare three or more items or qualities, you use a superlative form: the biggest, the best, the most interesting, the least realistic. Be careful to use these forms appropriately.

  • Do not use the comparative when you are comparing more than two things.

    INCORRECT Perhaps these kinds of feedback do not attract advertisers; feedback about a professor’s “Hotness”—the less important measure of effectiveness—apparently does.
    CORRECT Perhaps these kinds of feedback do not attract advertisers; feedback about a professor’s “Hotness”—the least important measure of effectiveness—apparently does.
  • Do not use the superlative when you are comparing only two things.

    INCORRECT When deciding between a commercial website and old-fashioned word of mouth, anyone who thinks that RMP offers the most useful information should keep in mind who writes and controls the site’s content.
    CORRECT When deciding between a commercial website and old-fashioned word of mouth, anyone who thinks that RMP offers more useful information should keep in mind who writes and controls the site’s content.