A Brief History of College in America
Published March 2018, updated Sept 2019
To understand “how we got here” on the de facto mandate of a college degree for most professional jobs, some historical review is helpful. As it turns out, the “college experience” is not and never was “market-driven” or “grass-roots” or anything of the sort.
Like any unrivaled monopoly (which is what college has become over the last half-century), the system’s cost has exploded while its quality (detailed further in the next section) has declined, to the point where students are often subjected to the most passive, least effective learning methods and immersed in a culture of “entitlement” attitudes and other adverse conditions. While some flaws were present from the start, the effects of pervasive institutional controls have more recently been compounded by outside interference and a lack of market alternatives. This combination of factors took time to fully develop, but the results are becoming harder to ignore.
The Origins of the University Model
The “university” model dates back to the 11th and 12th Centuries A.D., with the earliest institutions established in Bologna, Italy (1088), Paris (c. 1150), and Oxford (1167). That was before the Printing Press, when hand-written manuscripts were rare and knowledge was centralized among political and religious elites. The university model primarily consisted of a “learned scholar” lecturing an audience of students, who were expected to transcribe, largely memorize and recite the given subjects in order to merit being granted a “degree.” And institutional control (for better or worse) over the degree-granting process was critical.
Initially, according to Wikipedia, there was some effort to protect academic freedom, with a 12th Century charter that “guaranteed the right of a traveling scholar to unhindered passage in the interests of education.” Since universities were both non-vocational (non-market) and politically influenced from the start, that much-advertised “academic freedom” would not last long. The good idea of a more-independent “traveling scholar” eventually devolved into a permanent teaching residence with bureaucratic “tenure” protections, which have led to rigid orthodoxies and insulation from consumer feedback. (Wikipedia provides some valuable highlights but misses the point on tenure abuse. It also overlooks the origin of the “traveling scholar,” which was a Greco-Roman and early Church concept, and one that looks to be making a resurgence of sorts thanks to the less-restrictive nature of the internet.)
Regarding the original university model, the lecturing part made sense at the time, since the scholars owned nearly all the books. But the invention of the Printing Press (c. 1450) changed everything. Gradually over the next few centuries, books became cheaper and more readily available. So the need for an “expert” delivering lectures to a classroom tapered off considerably.
Two Centuries of Colonial/American Growth and Good Education via Private Methods
With books becoming more accessible to the general public, literacy rates in America exceeded 90% for whites (both men and women) by 1810. That was a time when the U.S. population was growing rapidly, the economy was advancing, and parents, employers and local communities were more actively engaged in education. Yet virtually no one went to college, state involvement in schooling was minimal, and federal interference was nonexistent. State-wide compulsory “free” schooling from elementary through high school began in Massachusetts in 1852, thanks to a small group of political agitators and against the will of the public. (The high literacy achieved before then seems to indicate that progress can be made without waiting for a “solution” from the government.)
Meanwhile, over in higher education, college popularity was extremely low for the first 200 years of Europeans arriving here after 1607. One concise history of college in the colonies and the early American years notes:
By the time of the Revolutionary War, there were nine colleges in the states. Enrollment up to this point was still quite small (rarely ever exceeding 100 students per graduating class), but those who did attend college became community and political leaders. … [as of the early 1800s] The number of colleges in America doubled in the previous quarter-century to around 20 institutions.
These small institutions (at the time) primarily catered to aspiring politicians, the idle rich and some state-subsidized religious groups. The latter category included the first colonial university (Harvard, founded in 1636, aided by “more than 100 different subsidies” by the end of the 1700s) and all eight other colleges existing in America as of 1776.
Early Roots of College Privilege
With the exception of Rutgers, which became New Jersey’s second college in 1766, the eight other prestigious colonial-era schools in Virginia (William and Mary, founded in 1693) and the northeast (Harvard, Yale, Princeton, Columbia, U. of Pennsylvania, Brown, and Dartmouth) were protected by a state-wide monopoly “charter” from the English Crown. This monopoly status was “jealously guarded” to “perpetuate an elite class,” and it remained in effect from 1636 to 1793 for Harvard and from 1746 to 1766 for Princeton. The other schools enjoyed near-monopoly status (except for some small, mostly private institutions) in their states until at least 1819, when a Supreme Court ruling is credited (by some) with opening the path to more competition. Outside of Virginia and the northeast, other states had already allowed limited flexibility in licensing colleges, but those schools were still few and small.
Most (if not all) of the nine colonial colleges were also given generous land grants and were exempted from paying property taxes. These early benefits to state-connected schools (which have always been incubators for politicians) set a precedent of attaching college education to government favor, a connection that has persisted to this day. (These two paragraphs are primarily based on Daniel Bennett’s research published in The Independent Review, Spring 2014. Key dates were verified in Wikipedia.)
While the university model in America was slowly gaining popularity by the 1850s, college enrollment was still below 2% among white men ages 16 to 25, and nearly zero for blacks and women (The Independent Review, page 515). In comparison, college enrollment today is around 40% for a similar age group. While some of that enrollment rise could arguably be attributed to increasing economic complexity, a counter-argument seems more valid: early college education in America and Europe focused on theology and other liberal arts, and all of those subjects (and much more) can now be studied at little or no cost online, from a variety of openly competing viewpoints and with market-based quality measures.
Furthermore, the highest-technology jobs in the free market (e.g., computer science) are the LEAST likely to require a college degree. Google and Apple don’t require one. Silicon Valley has plenty of non-college employees working on the next start-up venture, thanks to some innovative “boot-camp” teaching and apprenticeship companies. (Although I don’t entirely agree with some aspects of the “boot-camp” model, the fact that students are attracted to ANY alternative to college is promising.)
The Rise of Outside Subsidies and Pressures since the American Revolution
The authoritarian nature of the university model will always have its share of elite and political supporters, for reasons that are probably obvious. But the popularity of the university model among the middle-class and working poor in America can largely be attributed to non-market subsidies and pressures that make the college option seem more attractive or even “mandatory.” This is just another example of the economic pattern of “when you subsidize something, you get more of it.” In this case, add in decades of licensing restrictions, HR policy requirements and no small amount of political hype.
While the subsidies and pressures began slowly once America gained its independence from England, they are now approaching a dangerous “high water” mark (as measured by skyrocketing costs and debt, not to mention quality concerns). Some of the major federal interventions in college education are summarized below:
USDA Land-Grant Colleges (1862, 1890) – created about 70 colleges with an emphasis on agricultural science and military tactics, along with the “classical” liberal arts.
Student Army Training Corps (1918) – this predecessor to ROTC was “established at 528 [existing] colleges and universities across the nation” during World War I. By Wikipedia’s count, the SATC “was located on 525 educational institutions and inducted 200,000 total students on the first day.”
G.I. Bill (1944) – this mammoth federal program attracted over 2 million returning World War II veterans into colleges and universities with offers of free tuition plus a stipend for living expenses. It also “transformed American perceptions about college” and accelerated “future government involvement in the American university system.”
Great Society, Part 1 (April 1965) – the Elementary and Secondary Education Act (ESEA) led to fossilizing educational standards and administrative micro-management, federalizing what had previously been the jurisdiction of parents and local schools. It also had a side-effect of transforming high school into little more than college prep and pushing more students towards the diploma route. Even the supportive Wikipedia calls the ESEA “the most far-reaching federal legislation affecting education ever passed by the United States Congress.” This law has been reauthorized with new names attached roughly every five years (e.g., the No Child Left Behind Act of 2001 and the Every Student Succeeds Act of 2015).
Great Society, Part 2 (November 1965) – the Higher Education Act of 1965 (HEA) promoted the idea of college as a human right, opening the door to more federal intrusion into educational matters and a corresponding rise of pro-government PC intolerance. Among the many features of this law was Title IV, which created a federal authority to dispense financial “assistance” to students for any reason whatsoever (sans the military rationale of the past). This foray into personal career decisions (unimaginable even during the manic days of the 1930s New Deal) led to the explosion of college debt, which stood at only $64 billion as of 1985 and now approaches $1.6 trillion.
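As a quick sanity check on those debt figures, the implied growth rate can be computed directly. This is my own back-of-the-envelope sketch, not part of the original sources; the 34-year span (1985 to 2019, the article’s update date) and the $1.6 trillion endpoint are assumptions taken from the text above.

```python
# Illustrative only: compound annual growth rate (CAGR) implied by student
# debt rising from $64 billion (1985) to roughly $1,600 billion (~2019).
# The start/end values and years are assumptions drawn from the article.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Return the compound annual growth rate as a decimal fraction."""
    return (end_value / start_value) ** (1 / years) - 1

rate = cagr(64, 1600, 2019 - 1985)
print(f"Implied annual growth: {rate:.1%}")  # roughly 10% per year
```

At roughly 10% compounded annually, the debt total doubles about every seven years, a pace that far outstrips general price inflation over the same period.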
After boosting demand, adding administrative burdens and causing prices to rise, politicians (egged on by a host of special interest groups) have since moved in to further “help” the situation with even more grants and loans that just keep people stuck in the current system. Meanwhile, tuition costs and college debt continue to climb, while quality (as measured in a recent major study and observed in frequent anti-social public outbursts) is in steady decline.
Corporate Policies and Barriers
Soon after that artificial demand for college was stimulated and had worked its way through the economy, corporate Human Resources departments began making a college degree a prerequisite for more and more professional jobs. Various factors explain why owners and office managers authorized such requirements in the first place. Based on my personal experience in business and government environments, it is always easier to write a “tough” hiring policy on educational qualifications than to take direct responsibility for training junior staff. In recent years, the degree mandate has become little more than a “check box” to brush over during the interview process. But corporate mischief in education didn’t start there.
These HR policies are in many ways a continuation of the mandatory licensing rules that professional trade associations (like the large medical guild, the legal guild and the education lobby) have bought and badgered politicians to enact over prior decades. Behind the pervasive advertising about “quality” concerns, all licensing mandates act as a selective economic blockade: a wall of separation between consumers and “unapproved” vendors (i.e., merchants who didn’t pay sufficient tribute to the authorities and interest groups who make the rules). Licensing mandates adversely affect quality by creating a monopoly guild that minimizes competition in order to maximize profits.
In the case of higher education, licensing restrictions have forced all doctors, lawyers, and teachers, and some engineers, accountants and others into “needing” a college degree in the first place, adding an aura of prestige to the university system via its well-paid alumni. Once again, that college requirement did not come from consumer demand.
Not to be confused with the many partisan claims of “privilege” tossed around lately, the longstanding practice of enforcing state licensing barriers amounts to an official policy of intense “institutional privilege.” In the case of college, it initially favored, but now damages, the institutions and careers that continue to go along with the special arrangement. For example, consider the voluminous medical “red tape” faced by state-licensed doctors and the precarious position of the subsidized healthcare profession amid the growing “bubble” of medical costs.
Success without Subsidies: The Apprentice Model 2.0
On the other hand, it’s important to keep in mind that the Apprentice Model worked well for numerous professions (including engineers, doctors and lawyers) for centuries without subsidies, as noted in the 1921 book American Apprenticeship and Industrial Education (page 17) and in other modern references cited here.
In the legal profession, a good example of apprenticeship involves James Byrnes, U.S. Senator from South Carolina (1931-1941). In its biography of Mr. Byrnes, Wikipedia notes that he quit formal schooling at age 14 “to work in a law office, and became a court stenographer. …He later apprenticed to a lawyer – a not uncommon practice then – read for the law, and was admitted to the bar in 1903 [at age 21].” As a respected voice of moderation during the frenzied New Deal era, Mr. Byrnes (with no college whatsoever) was appointed by FDR to the U.S. Supreme Court in 1941.
Precisely because of political subsidies, the Apprentice Model was crowded out in the 20th Century by an induced surge of the University Model and the professional licensing cartels it helped foster. The Mentor Model is basically an update of the long-proven Apprentice Model, with the benefits of modern technology and some added safeguards.
I don’t mean any slight to the old Apprentice Model, which was appropriate for a time when companies tended to be small and mentors were often owners. The new “safeguards” are mainly to account for the modern economy where administrators and mentors are usually non-owners. So “ownership-like” incentives and agreements are needed to keep things running smoothly.
Summary and Conclusions
The current system of college education in the U.S. rests on an unstable foundation of political subsidies and official policies of privilege. It has pretty much been that way since the first college was established in colonial America in 1636, and the subsidies, privileges and associated restrictions have gradually intensified over the last century.
The attitude of “we’re special, we deserve special treatment” is hardly unique to academia, but it can safely be recognized as one of its original and ongoing core principles. And an entrenched $600+ billion-per-year industry can muster enough influence to resist change until it weakens and deforms under the weight of its own restraints. An abundance of evidence suggests that the overconfidence of privilege, along with being shielded from competition, has led to a host of inefficiencies and “blind spots” that have blocked positive innovation in college education for far too long. And no one should expect progress from sources with a vested interest in defending the status quo.
Like so much of modern academia, the “key nuggets” of value discussed above are dispersed among entire library shelves of books and journals filled with tiresome minutiae and distracting fluff. Following a more common-sense approach, aided by mentor training, independent scholars and internet access, this review of college history attempts to survey the wide range of options available, determine what really matters to the paying customer, and condense the findings into a presentable format. Hopefully, I have achieved that goal to the satisfaction of readers and potential partners.
My conclusions from this condensed history of American college education are:
Very little in modern college is “consumer-driven.”
The college approach is not “natural” (it exists largely due to outside subsidies and political pressures).
It is not “sustainable” (e.g., $1.6 trillion in college debt).
Many people would choose more mentoring / less college if given the option, since that was the “natural” choice before subsidies crowded it out.
Those subsidies and the associated high price of college simply cannot keep cruising on auto-pilot. In the very near future (perhaps right now) millions of students will be looking for a better way out.
The next section provides a glimpse of the modern “College Experience” that you will largely avoid if you decide to use the Mentor Model or similar apprenticeship programs.
For further information please contact: Steve@mentor-model.com