"Finally, I said that I couldn't see how anyone could be educated by this self-propagating system in which people pass exams, and teach others to pass exams, but nobody knows anything."
- Richard Feynman, in Surely You're Joking, Mr. Feynman!
Universities are idiotic structures. This is a belief I've held for the last few years, and I've only become more convinced of its truth after actually beginning uni. I have a sneaking suspicion many academics would agree with me.
Before going into any further arguments, I should specify that I'm mainly speaking about the undergraduate system, although the graduate system also has quite a few massive problems. I also want to mention that my argument excludes two particular cases - medicine and law - for which the current system seems to work well; I don't see anything particularly wrong with the way those subjects are studied. I'm not a doctor or a lawyer, but I don't need to be one to know that these are the only degrees which directly put graduates into a position of extreme responsibility. Medics and lawyers are quite literally in the business of life and death, and their degrees involve a significant amount of time spent in courtrooms or operating rooms, through which they gain practical experience in their field; it's impossible to practice law or medicine without this. Obviously, you'd want your doctor or lawyer to have a degree, and you wouldn't trust them without one. I should also mention that it's always easy to criticise, particularly when it comes to education. But I'm strongly against Chesterton's fence1, and I do have solutions for each problem I address.
With that out of the way, in approximately decreasing order of stupidity:
Attendance
The point of attendance in school is for teachers to let parents know if a child has gone missing. This makes sense. Schools are convenient institutions parents trust, and it's unreasonable to expect adults to sacrifice days at work to look after their kids. Everyone benefits, and if my kid told me they were going to school but ended up missing, I would definitely be a bit intrigued as to what was actually going on. If they're skipping school to get high, that's probably not great, but might be better than listening to teachers who ask for unquestioning obedience. If they're skipping school to work on a passion project, then I'd be very proud. If the kid is under the age of ten, I'd definitely be concerned by a random absence. This system is reasonable, maybe until middle school.2
Attendance is also very reasonable in any job, since you're expected to show up for the role you're getting paid for.
In university, though, student attendance is one of the most frustratingly stupid things I've ever seen.
University is supposed to be a young adult's first entrance into the real world, and their first experience of independence and responsibility. The responsibilities of a student are pretty meager, and not too different from the responsibilities at school: learn the material, study, pass your exams and get good grades. Instead of giving teenagers the opportunity to discover the value of learning on their own, attendance forces them to attend lectures just for the sake of attending them, and to go to tutorials just to avoid getting reprimanded by admin. So instead of giving students the chance to learn true responsibility - if you don't learn this, you're fucked - they're treated like high schoolers who have to ask to go to the bathroom3, and attending lectures becomes a forced exercise instead of an expression of genuine interest. Everyone learns at a different pace: making the smarter students go leaves them unimaginably bored of beautiful subjects, while slower students would benefit far more from a few hours in a library than from passive listening in front of a whiteboard (or, God forbid, a slideshow).
If a student genuinely cares about their subject and their success, they will learn and study anyway. If they don't, or are on the fence, making them go to class definitely won't fix the problem. Treating students as adults means giving them the opportunity to make that choice on their own, and then to live with whatever the consequences may be. For some, the lesson might be that they should've listened more and worked harder; others might be vindicated after getting perfect grades with zero attendance. The role of a student is to learn; why not let everyone decide how to achieve that, instead of stealing their freedom? Mandatory attendance is demeaning, extremely patronising and disrespectful.
The fact that attendance can drag down a student's grade is even more cruel. A grade is supposed to be an assessment of a student's understanding of a given topic; if they manage to do all their assignments on time and demonstrate excellence and understanding through an exam, how is the number of lectures they've attended in any way relevant? It becomes classic goalpost shifting: the point of university isn't to learn, it's to attend; and for what? To avoid being punished, rather than to be rewarded. This is a completely misguided incentive! Marking students down for attendance means that they stop going to class to learn anything: they attend so as not to lose anything. This is a very big difference.
Lectures, or the hegemony of dinosaurs
The format of university lectures began around the same time as universities themselves first appeared. This means the format has remained largely unchanged since 11th-century Europe.
The idea makes a lot of sense in the context of medieval Europe. A university was an organisation which probably had a decent amount of money, and sitting inside a big hall with hundreds of other people, especially during the winter, was a much better alternative to shivering inside a house with no heating. Since universities were almost universally run by monks, this was also a good excuse to get more people to engage with sermons; whoever didn't come to church could get a similar experience at university, so everyone was better off.
It also made sense from a teaching perspective. The best way to transmit information at the time was, literally, to just tell it to a bunch of people. Writing anything down took a very long time, and most people didn't know how to read. Books were reserved for the libraries of landed gentry, vassals, or monks, so it was pretty much impossible to learn anything unless you attended a lecture or took up an apprenticeship under a guild master. This was particularly true for visiting lecturers, whom you would likely never see again if, for some reason, they'd decided to grace your town with their presence.
None of this makes sense today. We can watch lectures for free from our beds, if we want to, or access almost any text in the world within a few minutes. There are no geographic barriers, no religious advantages, and no particular learning advantages to passive listening over active engagement or problem-solving (e.g. building a project, working through the subject on your own). If someone genuinely cares, they'll learn anyway. If they don't, well, they don't. What is the point of watching a lecturer stumble and fuck up or forget a proof, or of watching them clumsily battle the PowerPoint which, for some reason, never seems to work and always takes twenty minutes to set up? It just seems obviously better to either do the reading on your own, or watch something at 2x speed, especially if you're somewhat familiar with the material. If not, I'm almost 100% sure that the lecturers you'll find on YouTube are better than the lecturers at your university. Even if they aren't, all your lectures are recorded anyway; what's the point of going to the lecture hall? You're just wasting time, probably.
The best example I have for this is 3Blue1Brown. Look at the comments; this guy made his own Python library, and has a semester's worth of videos you can watch in one day. His explanations are undoubtedly the best ones on YouTube, and it's impossible for a lecturer to ever make anything as good, no matter how hard they try.
One of the big arguments against what I'm saying here is that a good lecturer can be an impresario, and convey a sense of the subject's excitement to potentially ambivalent students. This is definitely true, as Feynman himself showed. I've also been very lucky to have some excellent teachers. But that implies that the point of a lecture is entertainment, and that the actual learning is done when you sit down and try to figure things out on your own, carefully reading and asking questions. And, as I said, if someone like Feynman or 3B1B won't excite you, your local lecturer probably won't either. Definitely worth a shot, but attendance absolutely shouldn't be mandatory for anyone. Also, lectures are often boring because they're usually taught by decaying professors who clearly haven't done anything fun in a very long time. Lots of exceptions here, of course, but unless your professor is a superstar or someone you want to personally meet, I don't see the point of attending at all. You will learn more, and faster, on your own.
Maybe my experience is also coloured by what I saw in lectures, which was mainly Chinese kids watching anime or playing Minecraft. Nothing wrong with that, but it's pretty pointless; they could've just done that at home, or at least somewhere more comfortable than a lecture hall.
You basically don't learn anything; studying for exams is super boring and takes time away from genuinely valuable pursuits
I don't count 'learn everything to get 100% on the exam and forget it all by summer' as learning.
G.H. Hardy famously advocated for the abolition of the Cambridge Tripos:
"It has often been said that Tripos mathematics was a collection of elaborate futilities, and the accusation is broadly true. My own opinion is that this is the inevitable result, in a mathematical examination, of high standards and traditions. The examiner is not allowed to content himself with testing the competence and the knowledge of the candidates; his instructions are to provide a test of more than that, of initiative, imagination, and even of some sort of originality. And as there is only one test of originality in mathematics, namely the accomplishment of original work, and as it is useless to ask a youth of twenty-two to perform original research under examination conditions, the examination necessarily degenerates into a kind of game, and instruction for it into initiation into a series of stunts and tricks. It was in any case certainly true, at the time of which I am speaking, that an undergraduate might study mathematics diligently throughout the whole of his career, and attain the very highest honours in the examination, without having acquired, and indeed without having encountered, any knowledge at all of any of the ideas which dominate modern mathematical thought."
And the great Freeman Dyson's experience while studying number theory under Hardy's guidance, as recounted in letters to his parents:
"Dyson and Lighthill were allowed two years to complete their studies, while many of their classmates were given their wartime assignments after a single year. 'There is no talk of examinations,' Dyson reported to his parents, 'or anything silly like that.' Besicovitch gave Dyson problems 'which were impossibly difficult but taught me how to think'. His mathematical style, adopted by Dyson, was to build 'out of simple elements a delicate and complicated architectural structure, usually with a hierarchical plan, and then, when the building is finished, the completed structure leads by simple arguments to an unexpected conclusion'. They took long walks when only Russian was spoken, and Besicovitch invited Dyson to his private billiard table at his lodgings in Neville's Court."
Taught how to think! Sounds like a wonderful privilege. Is this accessible only to the select geniuses of our time?
Obviously not. Dyson's professors gave him genuinely challenging work, and he took it on because he was in love with the subject; like all great work, it was pursued out of love. The motivation to learn was intrinsic excitement.
This is completely annihilated when the motivation for learning becomes exams. I don't know how else to argue this: it's just boring. You're not working on solving any fun, challenging, important problems; you're not really discovering anything, or even trying to; if you're smart enough, you're probably not even being challenged, and you're definitely not being creative at all. Your motivation becomes centered around a grade. That is infinitely less exciting than working for the sake of something you love, and no matter how passionate you are about a subject, I guarantee that studying for an exam will sap any appreciation you had for it, since studying becomes less about exploration and more about completing a chore. That sucks.
The solution I offer is simple. Instead of exams, let students complete a research project, or something analogous to a thesis, throughout the year - or at least give them the opportunity to choose this option over exams. They choose some area or problem they find genuinely interesting, and spend a year (or a semester) building something from scratch, perhaps with the assistance of a tutor or a professor. This is how PhD students are assessed for their vivas, so why not implement the same system for undergrads? Doing this kind of project necessarily puts the student in control of their learning, and will almost certainly produce a far deeper understanding than cramming for an exam ever could. In the humanities, this could be a series of long essays on a particular topic, either chosen by the student or set by a professor; in the more technical disciplines, it could be an engineering project which results in something literally being built, a lab experiment resulting in some kind of study, or, for more theoretical subjects, a piece of software or a paper. Anyone who wants to go through the standard exam route can do so, but more ambitious students can go much deeper and further into their subject, without the standard constraints of a predefined syllabus. I have at least three examples of this working:
- I wrote a paper on a master's-level topic last summer, and taught myself the equivalent of a semester of algebra in three weeks. I've forgotten most of the stuff I've learnt this year, but I can still explain 90% of the paper from beginning to end without looking at it, because I was genuinely excited about the work. I'm not saying this to show off or to sound smart: I think anyone can do this, provided they care enough. My motivation came from looking at beautiful things and solving a puzzle, instead of chasing an arbitrary ribbon. I remember it well because I was excited, and I'm pretty confident the reason everyone forgets everything they learn as soon as the exam is over is that the process of exam prep evokes nothing apart from boredom and dread.
- These two textbooks are legendary for a reason.
- Why do you think so many CEOs drop out? Because they're bored, and have better things to do. Although I guess some credit should be given to universities, since if these people weren't bored, they probably would never have started such impressive companies.
I've learnt much more from these side projects, or from reading I'd do on my own, or from any one of Gwern, Tyler Cowen, argmin, Scott Alexander, LessWrong, Scott Aaronson, Dominic Cummings, Paul Graham, Zvi, Nintil, Patrick Collison, Michael Nielsen, Cosma, Feynman, Works in Progress, and Rudolf than from any class I've ever taken.
Credentialism
Everybody knows this already, but the main point of university today is to make hiring a convenient, easy process. Companies can't afford to vet everyone thoroughly, so a glance at where you got your degree gives them a very straightforward way to evaluate you. GPA obviously ties in with this, since getting a good grade implies that you (supposedly) learnt something and know how to deal with expectations.
This means that the nominal point of going to university (learning to become a useful, intelligent member of society) becomes redundant. It's true that Math 55 exists only at Harvard, but it's also true that anyone can look at the current course website and just do the problem sets. The goal of attending an excellent university isn't to learn something special, and for most people it's not even to meet an academic they're particularly inspired by; it's to signal that you know how to do things. I've never been inside a Harvard classroom, but I really doubt that the teaching is astronomically better than anywhere else. So the system becomes self-serving: the point of an Ivy education is to have an Ivy education. This breeds useless bureaucracy, and the universities become Too Big To Fail. Remember what happened last time someone was Too Big To Fail? They became a cancer.
The main problem here is that universities' existence becomes unjustified. They exist because, well, they've been around forever and will continue to be around forever. Obviously, they're also all businesses, and they've become a strange mix of insurance (if I don't go to uni I won't get a job) and a club good (wow, you went to X? Crazy!). So we have people lining up outside Studio 54 for the sake of … making sure they can get a job?
If only it were just that. On top of bending over backwards for the most exclusive insurance in the world, as soon as you get it, you'll be paying a premium well into retirement.
As with attendance, this gives the wrong incentives - the point of the credential becomes the credential itself. Try reading anyone's college application essays and tell me you don't feel bored out of your mind or cringed out. Imagine being someone who does this all day, just spending week after week reading thousands of these shitty essays written by kids begging for your attention, all clamoring for the badge of honor which is supposed to protect them from all evil. I wonder how many admissions officers have ever read Kafka?
The solution here is also very simple. Either abolish the undergraduate system for almost every degree, or make entrance open to everyone (as in Switzerland), with continuation contingent on exams (or projects). As far as grad programs are concerned, if you're interested in pursuing research, take the relevant department's exam: if you pass, you're automatically admitted into the program, and if not, you can try again next year. This removes the absurd admissions process (e.g. one of the Princeton questions: "Which song is the soundtrack of your life at the moment?" [sic]. Are you fucking kidding me? Or the classic totally-not-a-joke: "If you don't get into Harvard, what are your plans for the upcoming year?"), ensures that research programs fund and accept only the highest-quality students, and keeps some of the hiring-convenience clout of credentialism while making it much fairer for everyone to get a shot. Going to an excellent university remains impressive, and stops being a self-serving credential. There isn't enough physical space or enough teachers in any single university to teach everyone, but a lot of the teaching can just be done online, and the exams/projects can be hard enough to filter out everyone except the truly dedicated students. This system is very successful at EPFL and ETH Zurich, and optimises for excellence.
Many departments are completely pointless, and full of incompetent people (empty suits)
"But just because a subject isn't immediately useful/pragmatically important doesn't mean it's not valuable!"
Of course, and I'm not advocating for the abolition of departments. There is actually tremendous value in 'useless knowledge', mainly because it's often very beautiful - this is particularly true for my own favorite subject, which is so far removed from reality that it has effectively zero practical value. But then why burden academics with the additional responsibility of teaching, if they're in university to do research? And why sell students on the idea that university is an institution for learning, when it's actually an institution for academics to work? These seem like two completely different goals, and I don't see why they should be grouped under the same organisation. We can leave the departments alone, but then acknowledge that a university is a research center, and not a school. I guess the business model here is that subjects which don't generate revenue need funding regardless, and the easiest way to get a shit ton of money is to take it from parents' pockets. The fact that universities have become effectively immune to market forces means that they can be arbitrarily lazy, greedy, or just inefficient, and therefore let many incompetent people rise through the ranks, especially in the humanities4. See the Sokal affair for a demonstration of what I mean, and the IAS as an extremely successful example of a research-centered institute.
God this system is so fucking dumb
1 The classic statement of Chesterton's fence is along the lines of 'if there is a fence in your neighbourhood that's existed forever, but no one knows why it's there anymore, then you shouldn't get rid of it, because you never know what it's actually for. Don't destroy what you don't understand!'. I think a much more reasonable take is anti-Chesterton's fence: 'Unless we have strong justification for a currently existing system, why should it be kept alive? We should destroy it as fast as possible and replace it with something better.' ↩
2 I'm quite proud to say that I skipped school almost every Friday for a year or two, because all of the worst and most useless classes were on Friday. I'd usually spend that time reading or exploring something that interested me. The absence and homework reports were some of the most cherished objects in my room, and I taped them above my bed. I'm very lucky to have a mom who approved of and actively encouraged this. ↩
3 ??? How is this not extremely disgusting to teachers? ↩
4 I am not, in any way, saying that the humanities are less important or less impressive or less whatever - they are probably more important now than ever before, since most technical subjects will quickly be automated by AI. But it is true that these subjects are much more susceptible to bullshit. ↩