What makes a college a college? These days, anyone with a decent computer can design a fancy seal with a phrase like “Ever to Excel” or “The Way, the Truth, the Light” on it in Greek or Latin and start printing diplomas. And with the advent of online education, a college can flourish without an ivy-covered campus—or any campus at all. If the traditional marks of legitimacy—a campus, a library, dozens of students wandering around in “University of” sweatshirts—are not reliable, students need another signal of whether an institution is providing a quality educational experience. And the federal government needs a way to distinguish between universities that will be good stewards of financial aid dollars and those that will not.
In the past, students, policymakers, and institutions relied on accreditation to distinguish good colleges from bad ones. Accrediting agencies devise standards for colleges and employ a system of self-study and peer review to determine whether educational institutions are worthy of their seal of approval. The tradition of accreditation was so established as a measure of quality that Congress adopted accreditation as a precondition for institutions seeking to participate in federal student aid programs.
Today, however, accreditation is not so revered. Since the 1992 amendments to the Higher Education Act, there have been rumblings in the policy world that accreditors are not the custodians of college quality they were thought to be. And the recent attention to accredited for-profit colleges lends weight to that assertion, with figures like a 9 percent graduation rate for bachelor’s degree programs at the University of Phoenix and a 69 percent withdrawal rate at Kaplan colleges.
If the accreditation system is not broken, then how can such institutions still be operating with accredited status and the access to federal dollars that accompanies it?