College Alternatives, Part 2: Moving Out of the Educational Middle Ages

Everyone knows about the skyrocketing cost of college. We should also look at the antiquated and ineffective teaching methods colleges rely on—and develop a better alternative.

Published June 2018, updated Sept 2019

According to most historians, the European Middle Ages were a period beginning in the 5th century AD with the Fall of Rome and extending for the next 800 to 900 years. For the first few “dark” centuries of this period, education and art collapsed, skilled trades largely vanished, science was non-existent, and life was both squalid and short.


What little remained in the field of education was provided via small, somewhat independent monasteries. Then in the 11th and 12th centuries, authorities in Italy and England were the first to expand and alter the educational system by wedding it to state privilege and attaching top-down controls, most prominently featured in the “degree”-granting process. This new creation, known as the university system, is remarkably similar to what Western culture has re-adopted during the last century.


The invention of the Printing Press (c. 1450) challenged the influence of the initial university system, as books became cheaper and more readily available over the next few centuries. So the need for a university “scholar” delivering lectures to a classroom tapered off considerably.


But the ruling authorities in the 19th and 20th centuries were not finished with their tinkering in higher education. Once again, politicians and their academic surrogates could not resist the folly of European Statism and the promise of social engineering. Starting slowly during the Civil War, then steadily expanding through legislation during World Wars I and II and afterward, federal authorities financed and re-launched the university model in America.


Despite throwing trillions of tax dollars down that memory hole of ancient folklore, the $600+ billion per year university industry struggles. It can’t attract students without major handouts from Washington and the State capitols. And the system suffers badly from institutional intolerance and student apathy that become harder to ignore every year.


The inherent shortcomings of that business model will never be fixed from within because the system is unworkable to begin with.


History has shown us a better way. Technology offers a cheaper alternative. The subsidies are inherently corrupting to academic independence and will eventually run out. And the crushing debt of the university system is simply ruinous.

Overdue for a Second Opinion


In case you think that students might learn something useful while percolating in a passive classroom set within a giant complex of aging buildings… perhaps you should get a second opinion on that manufactured assumption.


In Part 1, I discussed how the well-crafted sales pitch of “You Need a Degree” profits a lot of debt-dealers, scholarship peddlers and college administrators who are eager to entice kids into making long-term financial commitments. Today, we’ll take a closer look at the quality side, along with alternatives to the “classroom only” approach.


The fact that some college professors passionately believe in their highly controlled system means little, except that many teachers are too deeply invested in their rank and privileges to objectively consider alternatives. In the end, anyone’s alleged motives don’t really matter much. Results matter more. And positive results tend to indicate good motives.


Whether you pursue liberal arts, business or the “hard” sciences (or all three), there is remarkably little that the non-challenging, hyper-legalistic and increasingly intolerant setting of modern college can do to broaden the mind and enrich the soul. I would argue that the opposite effect is more likely, based on our passive-explosive culture of college indoctrination and mass-conformity.

The Problem with Conformity


The initial problem with conformity—which has been nurtured by college-educated leaders and opinion makers in all walks of life—is that it creates a false sense of security, as in “everyone’s doing it, so it must be right.” Actually, the opposite is usually true: if everyone’s doing it, nobody is really thinking about it and it’s probably wrong. Excessive conformity also leads to intolerance and hostility (we’re way past that already), along with cynically rejecting any new ideas, no matter how badly the current system is falling apart.


If you’re not convinced that paralyzing conformity/intolerance/hostility/cynicism are a big deal in the U.S., consider Exhibit A: everything about college; Exhibit B: skyrocketing healthcare costs, with their safety blanket of insurance coercion; Exhibit C: the national debt; Exhibit D: the obesity crisis; Exhibit E: giving free junk food to millions of overweight people despite items B, C and D. This may be going out on a limb (by business community standards), but some folks would blame “conformity” for a fair share of the ongoing support for the social insecurity Ponzi, the failed war on certain drugs and the failed wars of global conquest.


The crazy thing is that some conformity-prone adults will read the above paragraph, delude themselves into believing that any slight deviation from protocol is “way too radical for me!” and then send their kids off to the anti-social training camps of passive vegetation, speech codes, thought-crime enforcement and political zealots who openly call for the forced overhaul of modern civilization. An even crazier phenomenon is that college “radicals” and their cheering squad in major media typically support or remain silent on 100% of the above destructive core-conformity while attempting to antagonize people with “identity politics” and other superficial distractions. The insanity has gotten that bad.


In short, conventional thinking on college (and lots more) doesn’t involve much thinking. It involves a deep fear of what might happen if we stop following the crowd. The submissive setting of college lectures (in-person or online) merely feeds into that conformist mindset of narrowly scripted subjects and opinions that always trends towards the lowest common denominator. Regardless of our past flaws, conformity was not nearly as big a problem a century ago as it is today, after the rise of widespread college-style education.

Moving Education Out of the Classroom


More specific to college teaching methods, the exhaustive offering of textbook formulas and boring lectures probably doesn’t enhance anyone’s “education” at all. And this conclusion is anything but “new.” It’s almost like the last three millennia of recorded history have been erased and overdubbed with Teletubbies and Wiggles reruns. Great stuff for toddlers learning primary colors and phonics. But a little rudimentary for young adults.


Around 3,000 years ago, the wise King Solomon gave us the saying “iron sharpens iron,” a phrase so valid and concise that it’s still used today. Less well known is the full sentence from which that Proverb is extracted: “As iron sharpens iron, so one man sharpens another.”


The focus here is on the dynamic interaction involved. Or “sparks,” in the sense of two swords striking each other in a battle of ideas. Socrates, the Greek stonemason- and soldier-turned-philosopher (c. 470-399 BC), latched onto a similar concept, using penetrating questions and sound reasoning via dialogue (not contrived “debate” spectacles) in the pursuit of virtue and wisdom. He never wrote a book, yet managed to influence civilization for centuries after his death.


The dull format of passive books and lectures—the methods of choice for pandering college professors as well as many subsidized leaders of institutionalized religion—merely “puffs up” the brain with empty knowledge. I mention the latter group because state-subsidized religion (often confused with independent churches or temples) was complicit in the original universities in the 11th and 12th centuries, all nine of the colonial-era colleges (Harvard, Yale, Princeton, etc.) and hundreds of scholastic institutions to this day.

The university system has always involved a high level of political privilege, corporate indoctrination and top-down control over the entire process.


This is not meant to imply that every aspect of the university model is false and detrimental (the attention to broader arts and sciences can be beneficial, for instance). It’s just that privilege, indoctrination and anti-market controls have always been key ingredients to that system—now more than ever. Yet both insiders and critics consistently fail to acknowledge that.


Based on today’s culture of permanent outrage and knee-jerk protests from one side—almost always self-serving and self-destructive—met with reflexive mockery and appeasement from the other, it’s safe to say that few people are being “sharpened” in college classrooms or church pews, or from watching/listening to TV news or AM talk-radio. Personal growth from those impersonal methods is simply contrary to human nature.

The Limitations of Books and Lectures (and their online equivalents)


Books and lectures are fine for introductory purposes, technical references or for entertainment, particularly when compared to heavy doses of mind-numbing television. But they fall short when it comes to personally challenging anyone in any moral, ethical, spiritual or even “practical” sense. And we should forget about any hopes of political advancement from those top-down and impersonal methods, as recent history has abundantly shown.


Those weaknesses should be obvious in the case of textbooks—the bedrock of the accredited university model. The moment any written document (in the absence of an engaging teacher) tarnishes someone’s belief system or cherished idols—such as their favorite politician, cultural custom or sacred object—or pushes them away from a harmful addiction… the reader puts it down, convinced the writer “doesn’t know what he’s talking about” or “is a self-righteous jerk who doesn’t understand me” or some other lame excuse, because the reader (or student) is in total control. That aspect alone is deadly to any effective teaching method.


By design, books are an entirely controlled setting for both the writer and the reader, unlike more dynamic and effective teaching styles, which involve a more balanced give and take. Books are more of a dump-and-run approach; that’s why they generally suck for anything beyond introductory learning purposes.


When it comes to lectures, for the last three or four generations we’ve replaced independent scholars (now making a resurgence on the internet and a few face-to-face settings) with leashed experts who are usually on the payroll of some corporation, institution or government agency. With extremely few exceptions, any expert who is beholden to a government, business or religious corporation will eventually succumb to the pressures to support the official script no matter how corrupt... or else they will be fired.


With books and lectures, the celebrity “experts” who dominate those formats have too much to lose to risk challenging any of the more dangerous modern orthodoxies that are tearing the nation apart, particularly in the world of education. In that restrictive setting, conformity and control masked with superficial glitz will always trump stimulating discussion or (perish the thought) creative problem solving.


If you want to learn via reading, articles on the internet and a few independent magazines are usually much better at conveying useful information and analysis. But they have limitations too, since you can’t ask spontaneous questions that would benefit both the reader and the writer; there is rarely an attempt at “leading by example”; any flawed idea (when unchallenged) can sound good on paper with the right finesse; and most websites and magazines are ideologically factionalized and prone to pandering to the established preferences of the audience.


(It shouldn’t be surprising that decent analysis and “real news” can more often be found on the free-access, market-driven internet, far outside the Good Ole Boys club of ultra-exclusive FCC licensing, urban newspaper monopolies and their collection of other subsidized platforms. Their constant bias towards central planning and social engineering is important to remember, since you will never hear any of this from the gossip/slander/advertising industry that specializes in state-sponsored complacency.)


Just a century ago, before we were lured away from the Apprentice Model, the general concepts above would be considered Teaching 101. Today, it probably sounds like some radical new idea. (If millennials want to think that dynamic interaction is a NEWS FLASH, TRENDING NOW!!! … that’s fine with me.)


But one-way books, newspapers, lectures and their online equivalents—the most persuasive methods among a passive culture of induced conformity, and the methods most easily tampered with—have always been the tools of choice among creeping authoritarians and their paid support staff. Take this recent example from The Los Angeles Times.

In a roundabout acknowledgement of the inherent advantages of dialog vs. reading, in August 2018 the LA Times published an article on a university (i.e., government) funded study they admit “backfired” in trying to “bridge the partisan divide” by “harnessing the power of Twitter.” In common language, that means relying on one-way information snippets to subtly alter people’s viewpoints and spending habits… which happens to be the entire business model of newspapers and “social media.”


Researchers from Duke University, Brigham Young University and New York University were disappointed to admit “Attempts to introduce people to a broad range of opposing political views on a social media site such as Twitter might be not only ineffective but counterproductive.” Buried almost 200 words into the article was an abrupt admission:


They already knew people become more inclined to compromise on political issues when they spend time with people who hold opposing views. Face-to-face meetings can override negative stereotypes about our adversaries, paving the way for negotiation.


Wow, that’s sure nice to know! But why isn’t this phenomenon more prominently discussed, not only in this article, but throughout mass media? Why doesn’t public policy reflect this important finding (i.e., stop wasting so much tax money on expensive delivery systems for books and lectures)?


For starters, the book/newspaper/television/college lecture industries wield immense economic power and comprise (along with Hollywood) the dominant political force in the U.S. Admitting inherent weaknesses in those platforms probably doesn’t seem fathomable to people who’ve struggled their entire careers to ascend the ranks of those privileged professions—where market competition is thwarted as a norm.


In the case of the LA Times, besides burying the most important observation and striving in vain to contradict it, even the article’s title was misleading. It stated: “Caught in a political echo chamber? Listening to the opposition can make partisanship even worse.” The sole photo caption used the same distortion: “Listening to those who disagree with our political views is supposed to make us more open-minded. But an experiment conducted with Twitter found it actually made people more partisan.”


But the study was about READING, not “listening.” Once again, a major newspaper failed to grasp the obvious regarding the vital issues of effective communication and learning. The story ends with an obligatory reminder that the university scholars “said it’s too soon to give up on the idea that social media can help bridge the partisan divide.” A more appropriate assessment may be that, among both the vast majority of mass-media and the sheltered academics they routinely feature, some people are too entrenched in their own worlds to ever try a new approach.

Outside the college realm, the pattern is similar. Now that more effective teaching methods have been pushed aside for a few generations, their easily mass-produced (yet somehow more expensive) alternatives have moved in to satisfy the “hunger for knowledge” that remains. In an attempt to feed that desire, Americans purchased over 680 million printed books in 2017. Total annual revenue for the book-and-lecture workshops of American colleges was $649 billion (Fall 2016 — Spring 2017 school year). Total government subsidies from combined federal, state and local levels on all classroom-based education (K through college) for FY2019 will be over $1.1 trillion.


Oh, yes. We love our books and lectures and other one-way communication methods. And most detractors are too busy complaining (via books, lectures, etc.) to offer a better alternative.


Rounding to the nearest billion, I’m guessing that federal, state and local government spending on any type of dynamic two-way teaching or “mentoring” is approximately zero. Not that I’m pushing for outside interference; just gauging our priorities.

Open Classroom of Civilization: Dynamic Two-way Teaching


Since the days of Socrates and Jesus, not to equate those two excellent teachers, we’ve known that two-way dialog and direct application (i.e., mentors and apprenticeship) are better ways of learning. The fact that both of those challenging, never pandering, teachers left a legacy that is remembered about 2,400 years and 2,000 years later, respectively, is a testament to the quality and durability of their teaching. Not just the words they spoke, but their methods as well. (In comparison, most of the drivel pumped out of the modern college system is forgotten within hours of cramming for the final exam.)

Most forms of public education in Europe (beyond isolated monasteries) took an extended absence during the Dark Ages, from roughly the 6th to the 10th century AD. During that period, extreme poverty and political oppression ruled the Western world.

Gradually climbing out of that pit of misery, dynamic hands-on teaching was central to the rebirth of civilization from the late Middle Ages (starting around the 11th century) until the beginning of the 20th century. From an educational standpoint, master craftsmen in guilds or businesses would train young apprentices, with the junior staff gaining both social and technical skills along with room and board. The owners would benefit from inexpensive labor that was not “free” or “easy to exploit,” since a competitive marketplace and parental involvement worked to minimize the latter.

Anyone interested in beneficial “liberal arts” could consult elders in their local community or, in some cases, independent scholars (something available now more than ever). Even the folks at Wikipedia inadvertently acknowledge this former opportunity, highlighting a 12th century university charter that “guaranteed the right of a traveling scholar to unhindered passage in the interests of education” as nothing less than the foundation of “academic freedom.”

Then they stumble a bit. Wikipedia’s unequivocal praise of bureaucratic tenure “protections” that have led to rigid orthodoxies and insulation from consumer feedback is standard academic tripe. That self-congratulating view contradicts much available evidence and is probably based on a failure to recognize the anti-market nature of academia itself. This pro-tenure, anti-market preference is common throughout the university system, yet without comparison in the rest of society. Much of this disparity can be attributed to the echo chamber and collectivist mindset of the college bubble. When Statists see difficulty, their solution is usually more top-down conformity and control, with heavy doses of legalism thrown in to keep any subversives in line. Market corrections and consumer choice are window dressing at best. This bias has led us to the nullification of any meaningful sense of academic freedom.


In contrast to the subsidized and divisive university model—which reaps over $500 billion annually in handouts, grants and loans as detailed in Part 3—the educational system of the Apprentice Model relied on mutual cooperation between students, teachers and consumers… as quaint as that may sound. Without subsidies, they had no other choice.


Dynamic, experienced and rational teachers never sought those privileges or accepted the debilitating restrictions attached. The Apprentice Model worked well for numerous professions (including engineers, doctors and lawyers) for centuries without subsidies, as noted on page 17 of American Apprenticeship and Industrial Education and in other modern references cited here. The Apprentice Model was crowded out by an induced surge of the university model in the 20th century, precisely because of political subsidies and artificial pressures, not because of any inherent shortcomings of professional mentoring.


The pre-Printing Press business model of the university system—which made some sense when it was devised in the 11th and 12th centuries, when hand-written manuscripts were expensive and rare—was largely irrelevant soon after America became liberated from British rule and its stifling culture of hereditary privilege. Although schools are loath to admit it, literacy rates in America were over 90% for whites (both men and women) by 1810. That was a time when U.S. population was rapidly growing, the economy was advancing and education was almost entirely a private enterprise.


For a more recent academic approach, the National Training Laboratories of Bethel, Maine confirmed a similar benefit for interactive teaching methods via research in the early 1960s. As visualized in their famous Learning Pyramid, lectures and reading provide the least retention; group discussion, practice by doing, and teaching others (or immediate use of learning) provide the best retention.


Yet the worst methods of learning get all the attention from the distorted educational marketplace we’ve created.

Why Such an Imbalance?


Most professional jobs now require a minimum of 17 years (K through college) of classroom books and lectures. If you’ve “only” got 16.5 years of that stuff, you’ll be treated as a leper and your resume will be thrown in the trash. (Some innovative companies are changing that pattern, but they are currently an enlightened minority.)


From the businesses I’ve observed over 25 years as a consultant or worked for as an employee, I’ve never seen or heard of ONE that provides even 0.1 years of anything that could vaguely pass for “mentoring.” Senior staff don’t have the time and aren’t given the positive incentives to encourage that.


Putting historical evidence aside and looking at it another way, no one in their right mind would go to a doctor whose resume said: I’ve read lots of books and sat through exciting lectures in school… but I’ve never actually practiced any of this stuff. Nevertheless, employers hire college grads with similar all-theory/no-skills credentials.


Educational reformer and New York City Teacher of the Year for 1989, 1990 and 1991, John Taylor Gatto, has done pioneering research (summarized in his book and videos) on how education has evolved in America. One of his many apt conclusions is:


What’s gotten in the way of education in the United States is a theory of social engineering that says there is ONE RIGHT WAY to proceed with growing up.


My question to employers and their Human Resources departments is simple:  Who decided on this need for 17 years of one ineffective learning style vs. less than 0.1 years of other styles like mentoring?


My point is NOT that classroom books and lectures are totally worthless. I’m just saying that history and common sense suggest that adding in some direct mentoring can make the overall learning experience into a much better educational package. In other words, instead of a lecture vs. mentor imbalance of 17 to 0.1, why not maybe 15 to 2, or something similar? And it doesn’t necessarily have to add up to 17 years.

Building a Better Alternative


The outdated university model of books and lectures and their online equivalents are fine for introductory purposes and as reference manuals. Beyond that, direct application under the guidance of an experienced professional can have a far greater impact. Doctors figured that out long ago. They call it residency.


I’m calling it The Mentor Model. It’s basically an update of the Apprentice Model for professionals, with some added Safeguards to protect and balance the interests of students, mentors and companies. One of the more important parts is to attach positive incentives for the professional mentors, instead of the failed mix of fear/guilt/charity we resort to now, as in “just do it, because it’s part of your job.” From my experience, that will never work. The basis here is to recognize that in the modern economy, potential mentors are usually non-owners. So “ownership-like” incentives are needed to keep things running smoothly.


The plan is for students, after 1-2 years of college, to go right into professional work. Since community colleges are very affordable and still do some teaching—skipping out on the “publish or perish” routine and other distractions—that’s a good way to get some core classes finished and establish a bit of independence after high school. It also helps with screening serious candidates.


The idea with The Mentor Model is to replace the last 2 years of college with a 1 or 2 year blend (depending on career choice) of work and study that is more productive than doing a bunch of classroom theoretical drills.


This also addresses the key question: who pays for it? Answer: You do.


During that 1 or 2 year Transition Period, students (junior staff) get a lower pay rate than a full-time salary as a trade-off for receiving valuable mentoring and work skills from an experienced professional. During the Transition Period, the target is for a reasonable split of your time on academics vs. work. For an example 50/50 split, that would translate to 1,000 hours of online self-study and occasional field trips if appropriate (unpaid academics) and 1,000 hours of paid work per year.


In the case of consulting engineering (my background) or any job like IT or accounting that bills hourly fees, instead of charging out junior staff at $100 per hour as is typical for entry level work, the company would charge $60 per hour for this example. Right off the bat, the client is saving 40% on junior labor and the firm has an advantage for bidding new jobs. Working 1,000 hours per year at $60/hr yields $60K to split four ways between the company, the junior staff, the mentor and yours truly (who gets the smallest slice). For employees that don’t bill hourly to clients, the concept is similar with minor adjustments. Economic details beyond that are left for discussions with interested companies.
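As a rough sketch of the numbers above: the $60/hr billing rate, 1,000 paid hours per year and the resulting $60K gross come straight from the example, while the exact four-way split percentages are my own illustrative assumptions—the article only specifies that junior staff end up earning $10-15/hr and that the program originator gets the smallest slice.

```python
# Illustrative sketch of the Transition Period economics described above.
# BILL_RATE and PAID_HOURS come from the article's example; the split
# percentages are assumptions chosen to satisfy the stated constraints.

BILL_RATE = 60      # $/hr billed to client (vs. ~$100 typical for entry level)
PAID_HOURS = 1000   # paid work hours per year (50/50 work/study split)

gross = BILL_RATE * PAID_HOURS  # $60,000 per year, divided four ways

# Assumed shares (must sum to 1.0). Only the junior staff's implied
# $10-15/hr range and the "smallest slice" for the program are from the text.
shares = {
    "junior_staff": 0.20,  # -> $12/hr, inside the stated $10-15/hr range
    "mentor":       0.25,
    "company":      0.45,
    "program":      0.10,  # "yours truly" gets the smallest slice
}

split = {who: gross * s for who, s in shares.items()}
junior_hourly = split["junior_staff"] / PAID_HOURS

print(split)           # dollar amounts per party
print(junior_hourly)   # effective junior pay rate in $/hr
```

Under these assumed shares the junior staff's effective rate works out to $12/hr; different split percentages would shift that within the $10-15/hr band the article targets.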


After 1-2 years in the Transition Period (as agreed in advance), junior staff would graduate to a full-time salary, having learned all the necessary textbook/online theory, gained diverse and mentored work experience, earned $10-15/hr and accrued no debt.


If a student wants to go back to school after some mentoring, that’s fine. Due to the financial practices of the modern education system, you’ll need to be aware of student loan and scholarship availability and debt repayment schedules if you intend to leave or re-enter the university model.


I should add: this is not an informal “internship” where students are often left doing a small list of mundane tasks with little or no supervision. If you’re thinking of participating in a summer internship, you may want to ask your prospective employer: Who will be my boss and how many hours per week of mentoring will I receive?  If you get a blank stare or weak excuses, that should tell you something. The internships I’ve seen are better than nothing… but just barely.

In summary, college is over-priced and over-rated. Americans figured that out a long time ago but seem to have forgotten it lately. There are much better and more cost-effective ways to learn. It comes down to educating yourself on the options and deciding which method is the best fit for your career aspirations.

Next: College Alternatives, Part 3:

Common Misconceptions & Obstacles to Progress

For further information please contact: