Is the Only Value to Traditional Education the Credentials?
Posted: Sat May 15, 2021 2:32 pm
By "traditional" education, what I mean is the standard compulsory schooling system familiar in the Anglosphere... primary -> college (maybe with an intermediate) in NZ, K-12 in the US, primary -> "big school" in England and Wales, and so on... plus university after that, rather than pre-institutional education, just to be clear.
I'm sure most of you are familiar with the idea of being self-taught or self-learning or whatever people call it. Usually what happens is someone (probably a teenager) uses books/the internet/other resources to cultivate knowledge in some specific area (typically, (military) history) and then extrapolates from that to claim one or both of (1) "this is a viable model for educating everyone at all ages" and (2) "institutional/traditional education doesn't care about learning".
Look, I'm perfectly willing to believe that the "actual" form of this position doesn't hold (1) to be true and that, rather, we're just talking about people who have done (most of) the compulsory part of traditional education, i.e. they're at least 15. We can talk about that, but it's (2) that I really wanted to talk about when I had the idea for this thread.
When I think about the way education works for anyone older than about 12, what I think about is a process of "teaching" and then "assessment about that teaching".
There are obviously a bunch of different ways that the "teaching" can occur. In a university setting, the traditional model is "the lecture", which is basically a lecturer talking for an hour or two, largely uninterrupted, probably with slides or other visual aids. Another common technique is the so-called "flipped classroom", in which the teacher gets the class to read up/watch videos on a subject prior to the lesson, which then takes the form of a process of engaging with that learning. In my experience, a large part of that engagement could well be writing paragraphs as a group about things in the prior learning, but you could argue that the "Socratic Method" is basically a flipped classroom approach. And then there's a lot of room between these two extremes, and outside of a university context there are probably quite a lot of lessons on each module before assessment (e.g. in history at school we'd usually have about eight or more weeks per unit, plus maybe two to three weeks of doing the assessment if it was internally assessed).
In the self-learning model, it's all flipped classroom all the time but there probably isn't any checking in... not with a teacher and not with a class. Why? Because that's the whole point... it's a single individual just doing some stuff by themself.
To my mind, self-learning doesn't exist. Oh, sure, you probably do genuinely learn some trivia, but the "education" bit is in gaining and cultivating knowledge through a process of contact with other minds. In institutional education this is formalised in the "assessment about that teaching" stage, though there are a lot of different kinds of assessment and they're not all created equal. However, there is also a lot of informal contact during the teaching stage, both between learner and educator and between learners. Indeed, a lot of assessment will explicitly or implicitly be intended to put learners together.
Without the contact part, self-learning is just a recipe for entrenching whatever ideas an individual already has in their mind. The people who advocate for it, or hold that it exists, usually don't think about the case of someone finding, I don't know, the Protocols of the Elders of Zion or, perhaps, some kind of Intelligent Design textbook and "learning" from that sort of material. Except, of course, they will and do... this is a big part of how conspiracy theories propagate: people try to teach themselves things they have no idea how to start learning. And the value of "assessment" isn't in the credentials that come out at the end (or the work towards a credential), but in being able to see where your thinking is. Or, in a crude sense, whether you have actually understood something.
(And, yes, this applies to coding, too. For example, in R you generally shouldn't use loops. However, loops are easy, they do what you want them to do, and the alternatives, even if you've been taught them, are harder to remember, even though they're usually shorter and more efficient in the end. But, yes, in general, I think the fact that you can see whether you've coded something right from whether the code works means there's an inherent "contact". (I would also argue that loops are easier to follow than apply etc. for the uninitiated reader.))
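To make the loop-versus-alternatives trade-off concrete: the post is about R, but the same tension exists in most languages, so here's a minimal Python sketch (my own toy example, not from the original) of an explicit loop next to the shorter idiomatic version, with the "inherent contact" being that the two can be checked against each other.

```python
# Summing the squares of 1..n two ways: an explicit loop (longer but
# each step is visible) and the shorter idiomatic alternative.
# This mirrors the R loop-vs-apply trade-off described above.

def sum_squares_loop(n):
    """Explicit loop: verbose, easy to follow for the uninitiated reader."""
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

def sum_squares_idiomatic(n):
    """Generator expression + built-in sum(): shorter, same result."""
    return sum(i * i for i in range(1, n + 1))

# The "contact" the post describes: the code itself gives feedback,
# because the two versions can be checked against each other or
# against a known value.
assert sum_squares_loop(10) == sum_squares_idiomatic(10) == 385
```

The point isn't that either version is "right", but that working code gives the self-learner a rare form of external check on their understanding.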
Of course, you do get places where you can apply that knowledge from self-learning but those situations are entirely incidental to the education. I would characterise the knowledge gained from, say, participating in an online forum as "incidental learning".
But, hey, that's just me... what say ye, NSG? Do people put too much stock in the idea of being self-taught?