A Case Against Credentialism and Formal Education

Magnus Renegade
11 min read · May 28, 2022


Introduction

Credentialism is based on the notion that in order to participate in some domain of knowledge or activity, one must first demonstrate, through the completion of a credential (some sort of written proof of certification in a specific area), that they possess the information, skills, attitudes, and values necessary to perform it in line with that domain’s professional standards. Examples of credentials by this definition would be degrees, licenses, certificates, certifications, diplomas, etc. Most people in modern Western society have credentials, such as high school diplomas, driver’s licenses, and university-level degrees, and it is commonly held that such credentials are essential (or damn near close to it) for one to participate in the adult working world and society at large. Most jobs in the United States (around ⅔) require a four-year college degree, and this number is expected to rise over time.

What are the problems with credentialism?

Well, in order to investigate this question, it is important to state the central argument credentialism is based on: namely, that it is necessary to prove knowledge before being allowed to enter a professional field. This premise is closely related to the central thesis that justifies grading systems and formal assessments such as examinations: that it is necessary or desirable to prove knowledge before the assessors can reasonably believe that you’ve learned something.

But is it necessary to actually prove that you know something before entering a professional field, and if so, are formal credentials really the best way to do it?

Let’s start with the latter and assert that credentials usually aren’t the best way to demonstrate knowledge. Take college degrees. Most people attend liberal arts colleges and are expected to take a variety of classes covering content that isn’t directly associated with their major. STEM majors usually need to take English classes. I’m expected to take an art class, even though I’m an engineering major. Poets take Biology 101. You get the idea. When one graduates with a four-year degree from one of these institutions, say a B.A. in Poetry, does that mean they know most of what they studied in school? If I looked at their transcripts and saw all of the courses they took and the grades they received, and saw that they passed everything, could you reasonably say that that person knows those subjects as a college graduate?

It is impossible to know everything, even within the comparatively smaller area of knowledge you have to work with in university, so most people simply forget most of the general education classes they take and don’t internalize any of the big ideas behind them. Science has a big idea about how we can go about knowing things: that through experimentation and repetition, we can find consistency in our highly varied subjective experiences of the world in order to accomplish certain results in reality. Do most people grasp that concept after taking 12 years of science classes and possibly 1–2 more in college (as a non-STEM major)? Do most people even really remember their high school or college education when they enter the working world?

To graduate in the United States, you have to have a certain number of credits in English, foreign language, mathematics, science, social studies, etc. But do most people actually develop mastery in any of those? Do most Americans really learn a foreign language to fluency or conversational proficiency? Do most people really grasp that mathematics is a fiction we have developed to describe and understand the world and as a means of thinking through problems? Do most people come away with the interpretation that history is a nonlinear process, with admirable and horrid qualities in every era, and that there is something to learn in all time periods?

No, most people barely learn anything, and the takeaways they do get tend to be basic knowledge that is far removed from how the subject actually works professionally or for people who care about it. Most people view history as a linear progression from terrible to better, which is vastly oversimplified, and find past humans to be stupid compared to the more enlightened people of today. And even setting aside the big lessons people take from formal history classes, people get the smaller facts completely wrong: missing, for example, that colonialism resulted in millions of deaths and suffering for billions, or believing that white people in the past didn’t know slavery was wrong.

So, we have two streams of ignorance: big lesson knowledge and small facts knowledge. Big lessons are the central ideas, values, attitudes, or ways of thinking that certain domains promote and rely on, while small facts are accepted statements of truth within those domains. For example, in the domain of mathematics, the big idea of calculus is that by transforming input-output relationships (functions) into components (differentiation) and combining those components back together (integration), we can understand rates and accumulations of change; in other words, calculus is the mathematical study of change. An example of small facts knowledge would be that the derivative of f(x) = x^2 is 2x.
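To make the distinction concrete, here is a short worked derivation (my own illustration, not something drawn from any particular curriculum): the small fact that the derivative of x^2 is 2x falls straight out of the big lesson that differentiation measures an instantaneous rate of change, taken as the limit of a difference quotient.

% the "small fact" d/dx x^2 = 2x, derived from the "big lesson" of limits of difference quotients
\[
f'(x) = \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h}
      = \lim_{h \to 0} \frac{2xh + h^2}{h}
      = \lim_{h \to 0} (2x + h)
      = 2x
\]

Someone who only memorizes the small fact can recite the answer; someone who holds the big lesson can re-derive it, and any similar rule, from scratch.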

Domains can become more specific too. We can focus on subdivisions of mathematics, like calculus, algebra, or arithmetic, or even more specialized areas like subtraction, quadratics, or area, and each would have its associated big lesson and small facts knowledge. As you learn any domain, becoming masterful at it is essentially the process of understanding the logic that underpins all of the small facts knowledge in the domain, which is the big lesson(s), or understanding the big lesson(s) first and then using them to understand the associated small facts. So, it can be a top-down or bottom-up process, depending on the person and subject.

When people experience domains for the first time in school, they often develop a negative working model of that domain. Going with the math example, as a result of school mathematics classes, most people view the subject as boring, dull, useless, a waste of time, harmful, and complicated and incomprehensible (i.e. negative attitudes); develop false ideas about how math works, getting facts wrong and not grasping the big ideas, such as thinking 7x8=65 or thinking math is primarily calculation problems and not a reasoning system (i.e. negative beliefs, information, and skills); and overall, lose any and all interest in pursuing the subject further once they are no longer required to learn about it in school. They may even internalize that they can never learn math, and this self-belief of incompetence and disinterest can be transmitted generationally.

So, when you see a high school diploma, which means its holder has completed about 12 years of mathematics, for 90%+ of people that doesn’t represent a basic understanding of math, let alone an advanced one. It may, in fact, mean that they have developed a long-term or lifelong dispassion and inability for the subject, and that compared to actual mathematicians (professional or amateur), they don’t really know it. Based on the statistics, it would be more likely to imply the latter. And generalizing this to subjects besides math, knowing that most people don’t learn a foreign language well in school and barely remember most of what they learned there, we can understand that this pattern of disinterest and difficulty may apply to most or all school subjects.

So, are credentials really the best way to signify that someone has learned something? At least for high school diplomas and many college degrees, no, not really. An easier example to prove this point is the driver’s license: over 99% of people on the roads are licensed to drive. How would you personally assess the driving ability of most people? How often do you or others ignore or violate traffic laws, by speeding and whatnot? Most people suck at driving, but they are licensed to do it regardless.

Now on to the initial question, the former: is it necessary to actually prove that you know something before entering a professional field?

The answer is it depends on the field, organization, and context.

If you want to enter the professional art world, most artists understand that you don’t need a college degree to do so. Most professional artists have a portfolio and many hours of artistic experience to demonstrate that they know what they’re doing, and if that experience and portfolio align with the organization’s goals, then you can be hired. The “proof” is the portfolio, and sometimes it’s run purely on the honor system. They just assume that anyone who would want to apply to some arts organization or group would know enough to feel comfortable applying.

Writing is similar. Most writing jobs are based on winning contests or festivals, developing a portfolio or professional website, having many publications under your belt, and having people in the industry vouch for you. Having a college degree is completely unnecessary for most writing positions. So the proof is essentially having your own independent means of demonstrating your knowledge and having other people in your relevant community of practitioners vouch for it. Generalizing those two forms of proof, that is how we can replace credentials.

One important nuance, though, for external means of knowledge demonstration is with fields related to safety and health. You don’t want to just trust that someone knows how to be a doctor, perform CPR, or fight fires. In situations where it actually matters that people know what they’re doing and we don’t have the time to see if it’s actually true, demonstrating your knowledge before a recognized professional based on some kind of test seems reasonable. (Of course, the professionalization of midwifery has erased or ignored generations of knowledge regarding childbirth, so this isn’t an automatic given for the medical industry.)

There are some additional considerations too:

There is no such thing as basic knowledge

Fundamental or basic knowledge is a fiction professionals and skilled amateurs create in order to help beginners get started with learning a subject, as well as to provide a rule of thumb for skilled practitioners to follow, especially in unfamiliar contexts. It’s not actually real in the sense of being an intrinsic feature of the domain you’re learning. When you learn a domain to a high level, you may already know that the basics you learned were only approximations and simplifications. You inevitably have to challenge your fundamental knowledge as you improve and adapt it based on new knowledge.

An example of this would be the idea that there are only two sexes in human biology. This is a fiction that hides the far more complicated world of chromosomes, primary and secondary sex characteristics, sociologies of the human body, hormone distribution ratios, etc. There are chromosomal variations such as XXY, XYY, and XXX. There are people with ambiguous sex characteristics, such as breasts and a penis. There are people we consider to be men who lack testicles or penises. There are men with higher estrogen ratios than other men or even the average woman. There are intersex people. The two-sex binary is not just a transphobic and queerphobic fiction used to simplify genetics, human biology, and sociology for beginners in those subjects; it’s also a deliberate tool of erasure used to deny the existence and rights of intersex people and unconventional men and women, as well as non-binary people.

A lot of these problems are caused by people having negative working models not just of biology, but of reality itself, with the false big lesson that for any one category of entities, there is a strict divide between two subcategories, i.e. black-and-white or dualistic thinking. Fortunately, reality doesn’t actually work this way for people who are well-versed in it: highly skilled reality practitioners understand that dualisms are tools made to simplify the world, not accurate representations of it. In reality, there are countless subcategories in any category of things, but for all intents and purposes, there are at least 3 subcategories (e.g. guys, girls, and non-binary folks in the category of humanity).

“Fundamental knowledge” is meant to give you a start, not to be the anchor you rely on forever. If you want to get good at anything, you have to update your basics in accordance with new information, and develop more elaborate, complex, and nuanced basic knowledge in that domain.

Resumes and CVs suck and we shouldn’t continue them

Everyone hates writing them. No one wants to read them. Portfolios, performance demonstrations, vouching, and literally anything else would be preferable. Please discontinue this. You don’t need my whole school or work history if I’m trying to sculpt for your museum.

It’s not necessarily a bad thing to be ignorant

You can always learn on the job and most people do anyway. Even if someone applied to a position that they didn’t have the necessary knowledge for, they may be able to learn quickly.

Home and community education could solve a lot of problems

In many Indigenous cultures and even in pre-industrial Western societies, formal qualifications didn’t exist. Most children learnt by actively participating in their community and mimicking their parents, so they grew up to possess all the necessary skills to partake in their lives and cultures. And most people trusted this system enough to have others be responsible for medical care, home construction, religious affairs, engineering projects, food issues, etc., essentially trusting their neighbors to have paid attention their entire childhoods in order to be fed, cared for, aided, and live in a stable building. And it worked.

Maybe there is a lesson to be learnt here in regards to promoting more non-Western and informal means of learning, such as apprenticeships.

The honor system isn’t a terrible solution either

And if it becomes obvious that the person truly doesn’t know what they’re doing, and you have no intention of teaching them, just kick them out of the group and find someone else.

People are biased and may give a false assessment of your skill

Especially with the omnipresent issue of bigotry such as homophobia, racism, and sexism, to name a few, the professionals or skilled and respected amateurs in your field may not give you a fair assessment because of your race, sex, or sexual orientation. Addressing bigotry as an issue is essential to making this work for everyone.

The professional-amateur binary is weird

More on this later, but basically the assumption that professionals (people who perform a domain for a living) are more trustworthy, reliable, and competent than amateurs (people who don’t) is wrong. For example, I’m not a professional skateboarder and I’ve never met one, but there is considerable overlap between the best amateur skaters and the average professional one. And even ignoring that, you don’t need a professional to learn how to skate or to do skating-related things. Learning from a knowledgeable amateur is perfectly acceptable and, in fact, that’s how most skaters learn (excluding those who teach themselves).

Taking this further, there are plenty of professionals who have all of the formal qualifications but straight up don’t know what they’re doing, not just in skating but in every other field, and going only by their formal qualifications would lead you astray.

Competency is not a scarce resource limited to professionals; it is something anyone can have and anyone can prove. We should evaluate competency based on our own standards and the evidence we are given (through portfolios, for example, or through vouching), not just how many degrees or certifications someone has.
