International parents and students considering an undergraduate education in the U.S. frequently consult one or more of the big three ranking publications ― the Shanghai, QS and Times Higher Education world university rankings.
Emphasizing research publication productivity and the accompanying reputation, these reports tend to filter out all but the top-tier research institutions in any country. With hundreds of high-quality yet sub-top-tier institutions in the United States, these premier reports are of limited utility to international parents and students comparing U.S. institutions. U.S. News & World Report (USN&WR) has been considered a source of reliable and relevant consumer information to fill the gap.
Since its initial ranking report in the mid-1980s, USN&WR has expanded its portfolio to include a selection of discipline- and profession-based rankings, as well as an array of regional reports. Within the United States it remains the dominant source of comparative institutional information. It is said that within hours of its annual autumn release, visits to its website jump into the millions. One can safely presume that a good number of international visitors are among them.
USN&WR’s appeal to parents and students appears to lie in the inclusion of more relevant consumer information in its mix of ranking metrics. While the big three and many of the other ranking systems that have sprung up over the last 30 years gauge institutional quality by metrics frequently calibrated to the generation of new knowledge and its subsequent impact, USN&WR focuses on metrics more directly reflecting the quality of an institution’s academic programming and its graduates. Well over half of the available 100 points are allotted to admission rates, student-faculty ratio, freshman retention and graduation rates.
Ranking systems weighted toward new knowledge generation tend to rely on one or more independent third parties for their metrics. For example, Thomson Reuters’ Social Sciences Citation Index tracks publications in 2,474 major social sciences journals across 50 academic disciplines; the research productivity and impact of an institution’s faculty are thus affirmed by an impartial third-party source. USN&WR rankings, by contrast, are substantially based on self-reported data provided by each institution, so the validity of the data could raise concern.
A recent New York Times article entitled “Gaming the College Rankings” identifies a handful of relatively well-known undergraduate programs that have been found to be, or have acknowledged, “twisting the meanings of rules, cherry-picking data or just lying.” Robert Morse, USN&WR’s director of research, is quoted as calling Claremont McKenna College “the highest-ranking school … to admit to misreporting.” The college has acknowledged that a high-ranking officer inflated the average SAT scores it reported to USN&WR over the last six years.
Gaming has also been found below the nationally ranked institutions. The New York Times further reports that Iona College, a small institution in a New York City suburb ranked 30th among the Northeast’s regional universities, would have dropped to 50th had its ranking been reviewed against corrected data.
Professional schools have also been identified as gamers. In recent years, two law schools, at Villanova University and the University of Illinois, have admitted misreporting statistics. The same New York Times article reports that Villanova conceded its deception was intentional, while Illinois did not acknowledge misrepresentation. A soon-to-be-released book, “Failing Law Schools” by Brian Tamanaha, a former law school dean, reaches a similar conclusion. He describes a number of questionable ways a school can attempt to advance its standing in the rankings. Among the tactics gaming institutions employ, Tamanaha cites selectively reporting admission test results to pump up a school’s selectivity image, hiring its own graduates on short-term contracts to inflate its employment statistics, and selectively reporting starting salaries.
There is, however, no reason to mistrust the USN&WR rankings simply because they rely in part on self-reported data. The publication crosschecks self-reported data against other public sources. Further, it continually adjusts its metrics and seeks to close loopholes. The vast majority of reporting institutions play by the rules and report accurate data. Still, international parents and prospective students may want to consult other public comparison sources.
By William Patrick Leonard
William Patrick Leonard is vice dean of SolBridge International School of Business in Daejeon. ― Ed.