IMHO this is true in a deep way, even more so for little kids than big people. From infancy children want to "do-with", and as every parent knows, "telling-how" or "telling-to" works only in the rarest of circumstances, with only the most mature of kids. But a kid who has learned via "do-with" then graduates to "do-myself" and then to "I-did-it-myself", which brings the greatest feeling of autonomy. Then groups can learn "we-did-it-ourselves", etc.
With big people, of course, the "apprenticeship beats classrooms" idea manifests in so many ways: everyone who "learns more in 4 months of work than in 4 years of school" is familiar with this.
I love (public) schools and academia and think on the whole they are treasured institutions, but there is a degree to which their presence and mode of operation leads to deprioritization of "doing-with", and this is very unfortunate.
I do tend to agree with you, and I'm a firm believer in "learning by doing", which I can especially see in my (young) kids. But I never got the "learned more in 4 months working than in 4 years of school" premise/idea.
Sorry in advance if this is a tangent, and maybe I'm just lucky to be the product of a good education, or perhaps I have always had the wrong jobs. But while I of course learned tons by working, I have never been able to compare what I learned at university with what I learned on the job; not because one is worth more than the other, they are just different.
On the job I never learned, or more importantly never had the time, to deep-dive into an issue for months at a stretch the way I could at university. Things have to be solved fast, go-go style, because a client is waiting. Perhaps this is a product of the jobs I have had; I don't have enough data to say otherwise.
But I never learned at university how to, for instance, lead a team or communicate effectively with non-academics. This is not something I expected of my university, of course, as those are very different skills.
But what I did learn was to understand the computer at a deeper level, which helps me every day and I find to be crucial to my day job.
I value my university degree/time dearly. It is of course not the real world, and of course not everything is applicable, but you never really know what is until you need it, I guess.
By this I'm not saying that teaching shouldn't evolve, innovate and/or change for the better. I think I'm just saying it is doing some things right, and I believe (no data backing here) that our industry wouldn't have gotten, or continue to get, the innovations we have achieved without at least some university foundations sprinkled in there :)
I can back up your experience. My university studies were focused on maths and theoretical computer science, and I've learned a lot of things there that are unlikely to be learned on a regular job. That said, at least my PhD did feel a bit like an apprenticeship in how it was structured. It was an apprenticeship for becoming an academic - and while I don't work in academia anymore, that has given me some useful skills that most of my peers don't have. (Of course, there are also skills that I don't have but some of my peers do. In many teams it's useful to have a mix of academically and not academically inclined folks, and the ideal mixture obviously depends on the kind of work the team does.)
Sure, not everything is applicable but the solid foundation goes a very long way.
I also see others fail at some tasks because they are not able to understand the foundations. I'm not saying that would be solved by going to university, but it might have helped to study the foundations without tight deadlines, given that in the working world nobody has time to wait.
I’ve been working in a technical field for 20 years without a degree. In my professional experience, every new college graduate that I onboard ends up needing many months of “do-with” time before they — and their fresh updated knowledge — are useful to the organization.
Why do you think apprenticeship doesn't give a profound education? The idea of apprenticeship is that you are learning directly from the master, who defines the curriculum. It starts with the theory the field requires; you just do real work instead of artificial test cases to cement the knowledge.
I think apprenticeship was always better than the classroom, but we were forced into classrooms by the circumstances of modernity: massive population growth and urbanization. In the 19th and 20th centuries we simply couldn't afford master-apprentice relationships when masters were few and pupils were many. Quantity was also preferred over quality.
Nowadays apprenticeship seems more feasible, given current demographics and a preference for quality labor.
If we’re talking about all areas of study, not only traditional trades, then what is an apprenticeship, really? Is it not merely an education with very low student-teacher ratio? If that is the case, then of course it would beat the classroom. It also won’t scale, hence classrooms.
This article does a pretty good job of supporting the advantages of apprenticeship. It does not, however, provide any support for the self-directed pseudo-apprenticeship that it advocates. I have little reason to believe that this model will work for anyone who wasn't already a capable autodidact. Apprenticeship without a master is no apprenticeship at all.
> what is an apprenticeship, really? Is it not merely an education with very low student-teacher ratio?
No. There are legal requirements for it [1].
In particular, it is a job. Even if you taught algebra one-on-one to students, that would not be an apprenticeship, because the teacher isn’t a worker using algebra for their job, and the student isn’t earning $15 an hour to learn how algebra is used in that profession.
Yeah, so the original article doesn't make any sense as an approach for society. Parents want their kids to learn algebra in the hope that they might get into an engineering program. Apprenticeships don't enter into the picture at all.
Imagine a mathematics professor trying to apprentice a teenager. It wouldn't work, since the knowledge gap between a high school grad and a working mathematician is huge. The kid wouldn't have anything useful to contribute for a very long time.
Perhaps this is why we only see apprenticeships in trades and the like. Those jobs don't require a huge base of knowledge just to be able to understand what's going on.
This is an amazing story. I don’t think it’s what the author of the article had in mind though. It sounds like the original argument for apprenticeships is supposed to be a large scale thing, for potentially all students. I can’t see that working for the vast majority of stem students, only the very rare gifted ones.
I think this article is not advocating for the legal apprenticeship, but instead for the theory behind it. A _good_ apprenticeship would not need pay, because the education side would compensate for the work side. But of course then lots of "entrepreneurs" would take advantage of this, and so it's not possible to do it for free.
As a closely related example, internships can be unpaid in Spain, and it happens quite often that the business just rotates interns without teaching them anything. So when the maximum legal period of unpaid internship is over, they just find a new intern. Sometimes they do pay, but pennies on the dollar, because there are no minimum requirements.
That sounds like making up a new definition of "apprenticeship". Why would an apprentice not need to be paid? Is this only for the independently wealthy, or do they have to get a second job on the side in order to eat?
Because they would be actively taught. It's somewhere between a university class (where the student pays) and a job (where the employer pays); the question IMHO is more about where the line between learning and working falls. Apprenticeship seems to be catalogued as more work than learning, but that's exactly what the article discusses: how it didn't use to be this way. It seems like this now (minus: the student pays; plus: the student is paid)*:
-$$$ focused tutoring, maximum learning
-$$ cram schools, highly focused learning
-$$ private schools, good learning
-$ public schools, mediocre learning (US-based)
?? apprenticeship
+$ 0-1 internships, learn by following
+$$ normal job, no specific learning time
+$$$ senior/executive, just work
Note: I'm not 100% confident with these numbers since in my home country things are a bit different, e.g. public university is better than private one, and internships are normally free.
Example: I taught in a bootcamp. Students paid a non-trivial amount of money for learning, and they were taught in classes of ~20 students. I think that would have been the perfect situation for an apprenticeship. I would not mind teaching someone programming if they help me with design for example. Help going both ways seems just like a normal exchange.
For me, the definition is basically: it's a full time job, but you're being paid to learn instead of being paid to deliver. Most of these companies are investing in apprentices with the hopes of hiring FTEs in a different way.
"We might not bring back apprenticeships, but by bringing back the features that work, we can all hopefully learn a bit better."
The piece seems to ignore that apprenticeships are still a thing in quite a few countries around the world. Here in Germany they have developed continuously since the medieval times this piece finds so great, and they usually include stretches of classroom attendance to get the theory across, because you just cannot learn everything by doing in such a setting. I would guess that's even more true today, when even trade companies are quite often relatively specialized.
True, but in the United States we've spent decades disdaining them and distributing propaganda that college is the One True Way. This has had many bad effects (http://seliger.com/2017/06/16/rare-good-political-news-boost...) that are obvious to me, having spent many years teaching freshmen who don't want to be in college and would be better served by alternate forms of education or work.
In the States, the author's model is absolutely true. Pick up a high school algebra text and it's a Springer book, just more colorful and with some meager attempts at explanation.
American education, at least what I have seen, is devoid of applications.
To be fair to educators, we do have a cultural issue here in that for many, adolescence is a time spent learning a lot about fads, entertainers, and gossip. It is hard to motivate anyone to learn when there's no external motivation.
Motivation is key. The successful apprentice is one who is motivated to learn from the master. The only problem with the classroom method is that it is often demotivating for the students.
With all the online information available, autodidacts have a great opportunity.
Autodidactic learning is not always ideal: you may miss some fundamentals of the trade that turn out to be important when you're setting the curriculum yourself. In some trades, this can potentially endanger people. The other thing autodidacts usually don't get is the unwritten, oral part of apprentice training that is passed on informally and often even implicitly.
You could make similar vague criticisms about academically trained people, though. They may enter the workforce with few practical skills, their academic sensibilities may interfere with pragmatic decision making... That's no more of a strawman than your comment.
Personally every time I’ve taught myself something, I’ve started with the fundamentals. I taught myself databases from the ground up. I started with learning query optimizers, data normalization and MVCC.
I think your criticism is much more dependent on the individual than it is how they were trained.
I don't see how my argument is a strawman. I have learned a great deal of stuff in an autodidactic fashion myself. A big portion of my Ph.D. work is based on things I learned autodidactically. I know that I missed existing knowledge, didn't always look in the right places, etc. I have been extremely lucky that none of the work I have published so far has turned out to be covered in previous work that I missed.
Research, as much as trades, has its own oral history, informal institutional knowledge and so on. It's being told by the supervisor that it's not worth pursuing a certain course of action because there's a non-obvious roadblock that leads to unpublishable negative results. Or it's being told that you should try procedure X in this specific step in your experiment because $obscure_reason or "it just works better when you do it this way" and a shrug. There is tons of experience passed down from one generation to the next everywhere that is never put in proper writing. Getting access to that knowledge is just an enormous advantage.
Well, both of us are really just putting forward our own anecdata relating to each method of learning. If we generalise that to apply criticism to larger groups of people, then I'd say that's when we cross the line into creating a strawman. Each method certainly does have its pitfalls, but also its own ways of overcoming them. If we wanted to discuss the likelihood of falling into those traps (or, more generally, the effectiveness of autodidactic learning vs formal education), then we'd need some data (I'm not aware if this has been studied; maybe it has?). We'd also need to put a fair amount of thought into defining the criteria for measuring that effectiveness. For example, I taught myself to change the oil in my car. If you assess my abilities as an oil changer, I think I'd be quite competent; if you assessed my abilities as a general mechanic, I'd be woefully incompetent.
> I have been extremely lucky that none of the work I have published so far has turned out to be covered in previous work that I missed.
This reminds me of how Ramanujan is said to have independently remade numerous previously made discoveries in mathematics. Certainly a risk of autodidactic learning.
> There is tons of experience passed down from one generation to the next everyhere that is never put in proper writing.
Perhaps this depends on the field. For software development, huge amounts of this knowledge is available online. I set out to learn GraphQL recently, and after reading the spec, I gained most of my knowledge on the topic from a mixture of experimentation and reading online writing. But I guess that raises the question, did I teach myself these skills, or was I taught them by dozens of random stack overflow users and blog writers?
Those don’t sound like the “fundamentals” of databases at all. They sound like pretty high level concepts.
When I studied databases in college, we started with how pages are stored on disk. That is fundamental to both data and indexes, and everything that uses them. Query optimizers depend on almost every other part of the database; in the databases we wrote for class, I don't think we ever got that far.
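To make "fundamentals" concrete, here is a minimal sketch of the slotted-page layout many databases use on disk (illustrative Python; the names and sizes are my own invention, not any particular system's). Records fill a fixed-size page from the back while a slot directory grows from the front, and an index entry ultimately resolves to a (page, slot) pair:

```python
PAGE_SIZE = 4096  # a common on-disk page size

class SlottedPage:
    """Toy slotted page: records fill from the end, slot directory from the front."""

    def __init__(self):
        self.data = bytearray(PAGE_SIZE)
        self.slots = []            # (offset, length) per record
        self.free_end = PAGE_SIZE  # records grow downward from here

    def insert(self, record: bytes) -> int:
        # Rough header estimate: a free-end pointer plus 8 bytes per slot.
        header = 4 + 8 * (len(self.slots) + 1)
        if self.free_end - len(record) < header:
            raise ValueError("page full")
        self.free_end -= len(record)
        self.data[self.free_end:self.free_end + len(record)] = record
        self.slots.append((self.free_end, len(record)))
        return len(self.slots) - 1  # slot id identifies the record within the page

    def read(self, slot_id: int) -> bytes:
        offset, length = self.slots[slot_id]
        return bytes(self.data[offset:offset + length])

page = SlottedPage()
rid = page.insert(b"alice,42")
assert page.read(rid) == b"alice,42"
```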
I think the best example of a middle-class modern apprenticeship that people will be able to relate to here is the PhD. It’s an apprenticeship in research where you’re paired with someone who has mastered the craft and passes it on 1-1 to you by helping you do practical work (actual research with research outputs) outside of the classroom.
I think ideally a PhD is an apprenticeship. Practice in my experience is unfortunately not like an apprenticeship, instead it's a low paid job that can be quite abusive. Plus many of the things you "learn" aren't good for research in my view, e.g., I think "publish or perish" causes many researchers to avoid projects with the largest impacts. This is treated as "smart" but I strongly disagree.
That's what makes PhD work a good example, I think, because "a low-paid job that can be quite abusive" describes apprenticeship very well. It's inherently reflective of the quality of the master. If the master is an abusive drunk, the apprentice is going to have a bad time. Also, apprentices historically had to do much grunt work, like (in blacksmithing, to choose a profession) churning out nails. Produce or perish.
I think there are advantages to bringing back parts of apprenticeship, but it is not a system without flaws or weaknesses.
I personally don't mind the grunt work and don't consider that abuse. Grunt work is either outright useful or a learning experience.
If blacksmiths were academics they would produce a large number of faulty nails. (One might argue the vast majority would be faulty!) I doubt blacksmiths would find that acceptable, but many (if not most) academics don't seem to care too much about quality as long as the paper passes peer review.
A PhD can become abusive when the advisor wants to produce more work when someone has done more than enough for a PhD. At that point they have a full researcher getting paid only a fraction of what they could be.
PhD students are paid less than apprentices last I checked, too. According to this page apprentices are paid roughly 1/3 of what professionals are paid:
So far, at no point during my PhD was I ever paid 1/3 of what my advisor is, even if you factor in benefits. I haven't even received a raise (to my knowledge), but he has.
Again, ideally a PhD is like an apprenticeship, but my experience suggests that case is rare.
My experience mirrors yours. A Ph.D. is an arrangement where your advisor can keep asking for more work, beyond all reason. Unlike a (say) blacksmith apprenticeship, one cannot just leave with "six years of apprenticeship in X"; one instead must quit one's Ph.D.
Yes, peer reviewed papers are nails in the analogy. (Non-peer-reviewed work might as well not exist to many academics in my experience, regardless of its quality.) There are good nails and bad nails, just like there are good papers and bad papers, though the criteria for what makes each good differ.
There probably is some innovation in nails still going on. I imagine it's mostly about manufacturing methods and materials.
The same most definitely happens with real apprenticeships. In countries like Switzerland, apprenticeships start after the 9th school year and take around 3 to 4 years. During the apprenticeship, the apprentices still have to attend school and take tests. This is a common point of conflict. Imagine you are a 17-year-old who wants to become a cook, and your boss makes you clean the kitchen until 2am while the next day you have a test on algebra at 7:15am. There are laws against this kind of stuff, but enforcement is really spotty.
As I've pointed out in other threads, I used to mentor people in audio engineering. This is now legally very risky, as I am immediately vulnerable to a wide swath of labor regulations in NYC.
Not really worth it anymore.
Meanwhile it is commonly known to be true in many industries that someone is mostly useless, work productivity wise, when they graduate college. It takes years for them to “know anything.”
I’m not sure what the solution is, but obviously apprenticeships are a huge missing piece.
Do you still hire college grads, or just anyone? Sure, the grads don't have any experience, but that's better than someone off the street who knows less than nothing.
Yep, theory vs. application. You learn in the workplace (if you're competent) what is and isn't worth knowing pretty quickly, rather than after 3-5 years.
It's crazy to me that modern adult life is nothing like a traditional classroom format, and yet we spend 18 years preparing our kids for it by having them sit in a room and get talked at. Sure, there are other kinds of activities, but lecturing takes up such a huge chunk of the time spent in the educational system for most people.
Alternatives or complements like apprenticeship that can teach in a different way are really interesting and exciting to me.
A lot of adult work is fairly similar to the classroom in the sense that you're sitting at a desk doing stuff, usually with computers, usually schlepping information around.
It would be nice if both had less of this quality. But still, a lot of education is a bureaucracy teaching people to be bureaucrats. If we are to have a bureaucracy, traditional schooling is certainly a training ground for that.
I don't know, I've been in plenty of meetings where the best action for my career was sitting still and shutting up for a few hours while someone talked.
Kidding aside, I totally agree that it's quite silly. I hated school when I was in it but as soon as I left I couldn't stop learning - there are so many opportunities for knowledge online now (and in real life too) with MOOCs, Youtube lectures, free software tools to play with, etc.
It's worth keeping in mind that "education" as we practice it was Roman, passed through cathedral schools for priests to the British Empire, becoming what we know today as academia.
I'm a physics and computer science high school teacher, I think lecture takes up about 5-10% of my classes. This is not unusual in high school. (I'm in Los Angeles)
On the other hand, when I was a student in college it was about 95% lectures.
That's awesome to hear! I'm recalling history, math and science lectures, and English classes. Just thinking back to both high school and college, there was a lot of lecturing.
Some sports leagues draft kids out of high school. What if companies started offering 4-year apprenticeships out of high school to the most promising students? How much should they pay these 18-year-olds for those 4 years? Would the colleges lobby the government to ban apprenticeships?
Look up the German apprenticeship model. We are doing this. The students get paid, but not the amount a trained worker gets.
It's called "Duales Studium" and is available for a lot of different jobs. Examples: engineering, architecture, trade, ...
At the end of those 3-4 years you get a bachelor's degree.
>The downside is that they'll only learn skills that are valuable for that specific company.
In my experience, university did not provide any skills valuable for any company, other than being useful for networking and a little bit of politics if you're involved with organizations in the university.
The big value, at least for 18-22 year olds, is from proving that they have the discipline to put in the work to get into a selective school and/or achieve high grades in difficult subjects. As an employer, my main concern when hiring someone is if they are capable, and if they are disciplined enough to apply themselves. If those two are true for a 22 year old, then there is a good chance they can be useful regardless of their specific education.
Anecdotes aren't data, but even though I knew how to code long before my university CS course, having to do a CS undergrad is the only way most people will learn the boring-but-essential or hard-to-learn parts of CS or SE.
For example, I built commercially successful (by my standards) software while in high school that used O(n^2) algorithms (!!), which is fine for small applications, but the difference between someone "who can code" and someone "who can code well" is typically a BSc and an understanding of how to make things that can scale.
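To make the scaling point concrete, here's an illustrative sketch (hypothetical, not the code I actually shipped) of the classic trap: de-duplicating a list with a nested scan is O(n^2), while a hash set makes it O(n), and only the latter survives large inputs:

```python
def dedupe_quadratic(items):
    # For each item, scan everything kept so far: O(n^2) overall,
    # because list membership ("x not in out") is a linear scan.
    out = []
    for x in items:
        if x not in out:
            out.append(x)
    return out

def dedupe_linear(items):
    # A hash set makes each membership test O(1) on average: O(n) overall.
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# Both give the same answer; only one survives a million rows.
assert dedupe_quadratic([3, 1, 3, 2]) == dedupe_linear([3, 1, 3, 2]) == [3, 1, 2]
```

Both are "code that works"; the training is what teaches you to reach for the second one by reflex.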
We're all familiar with the trope of the medium-sized business exec who scoffs at the invoice of an SWE consultant, arguing "my 12-yo nephew could have built that!" Superficially they're not wrong, but their 12-yo nephew couldn't build something that scales to millions of users and petabytes of data.
I've found a quick-and-dirty way of identifying the unwarrantedly self-confident types: ask them how they'd quickly put together a CSV parser. If they give an answer involving `String.Split`, you can tell they don't have a degree.
> BSc and an understanding of how to make things that can scale
That certainly is not the only way to learn how to write scalable applications. (e.g. reading about techniques and/or building them).
> We're all familiar with the trope of the medium-sized business exec who scoffs at the invoice of an SWE consultant, arguing "my 12-yo nephew could have built that!" Superficially they're not wrong, but their 12-yo nephew couldn't build something that scales to millions of users and petabytes of data.
Scaling to millions of users and petabytes of data is not needed in many cases and a simpler solution can get the product shipped faster and with fewer bugs. Part of being a good SWE consultant is taking the time to gather the real world requirements, the potential future uses, and clearly explaining the expected operational envelope to the client.
> I've found a quick-and-dirty way of identifying the unwarrantedly self-confident types: ask them how they'd quickly put together a CSV parser. If they give an answer involving `String.Split`, you can tell they don't have a degree.
I'd rather have someone who bothers to ask what the parser will be used to process (and what will be consuming its output) and gathers the real-world requirements. Someone who confidently assumes they know what the CSV "spec" is and starts coding a parser for the RFC 4180 general case can waste lots of time and money building the wrong thing. There are certainly cases where splitting a string on line breaks and commas is a good fit for the problem. There are also cases where that will parse data incorrectly and explode on files over a certain size.
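To illustrate that failure mode, here's a hypothetical row where splitting on commas silently corrupts a quoted field, while a real parser (Python's csv module here, standing in for whatever your language provides) handles it:

```python
import csv
import io

raw = 'id,note\n1,"Hello, world"\n'

# Naive approach: split every line on commas.
naive = [line.split(",") for line in raw.strip().split("\n")]
print(naive[1])   # ['1', '"Hello', ' world"'] -- the quoted comma broke the row

# A real CSV parser respects RFC 4180-style quoting.
parsed = list(csv.reader(io.StringIO(raw)))
print(parsed[1])  # ['1', 'Hello, world']
```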
> I've found a quick-and-dirty way of identifying the unwarrantedly self-confident types: ask them how they'd quickly put together a CSV parser. If they give an answer involving `String.Split`, you can tell they don't have a degree.
I'm pretty sure it would be faster to put together a CSV parser using String.Split than to write a streaming parser to handle the problem. You did specify that you want this done quickly, right?
My immediate thought would be to make sure commas that appear in field values don't cause spurious splitting. After that, yeah, String.Split.
Can you say more about why this means I don't have a degree?
I wonder what the difference is between a classroom with a single teacher tasked with educating a room of people who might not be interested... and an apprenticeship with one person who likely is interested.
To me it seems obvious that an apprenticeship works better for that task... but it's also not a fair comparison, considering the classroom's goals and structure, for all sorts of reasons.
I also wonder about how well apprenticeships put forth new ideas or ways of doing things.
> classroom with a single teacher tasked with educating a room of people who might not be interested... and an apprenticeship with one person who likely is interested
In my experience, the "should-be apprentice" in a class is the one who ends up being frustrated by the class environment, because they care.
I see classes like commercial farms: everyone gets the same amount of water, nutrients and sunlight. An apprentice's master is like an expert gardener who notices when one plant needs less sunlight and provides shade, and who provides calcium when one plant needs it.
That's why small-time farmers are usually the ones who grow freakishly big tomatoes, carrots, pumpkins... These freakish specimens, created through painstaking, personalized "education", would be called geniuses if they were humans.
In small classes, students often get this personalized treatment. But the majority of classes aren't small.
One big difference is that a classroom has 30 or 50 or 100 students, while in an apprenticeship environment you can have probably up to 5 students. The classroom is much cheaper.
Another difference is that in an apprenticeship you can "fire" the students. In a classroom there is a big pressure to keep the students there.
Also, there is a different psychological dynamic in small groups... The teacher naturally knows and cares more about the student, and the student feels socially accountable (i.e. "I want to make so-and-so proud").
And, in a real sense, the master and apprentices are a team.
And this isn't the "we need more money and teachers" argument I'm making; just that the large-group lecture format only works well for certain stages of learning and types of learners.
> I wonder what the difference is between a classroom with a single teacher tasked with educating a room of people who might not be interested... and an apprenticeship with one person who likely is interested.
The teacher's job is to teach, and frequently teachers rate those who just finished a class as more capable than those who have a bunch of experience. Whereas in an apprenticeship, the teachers are doers and are training doers. I think every first-time intern I've ever hosted has been astonished at how much quicker they learn than they did in college.
I work in software supporting a field I knew little about (genetics). After 6 months of coding up tools, going to talks and kinda getting the gist, I started taking a couple of classes. It really helped me fill in some gaps in my background knowledge, and I wish I'd done so sooner.
So I think both learning methods have their place.
The intermingling of economic-focused and academic-focused educational systems is the core cause of this issue. Universities used to be for scholars, whereas apprenticeships were for tradespeople. Over the past century, universities have essentially replaced the role of trade school/master and thus you get the intellectual abstraction process applied to professions which have very little use for it.
There are very worthwhile areas of knowledge which have no economic value in the current system. Ergo the apprenticeship model makes no sense for a field like literature or philosophy. They belong in the academy, not the workshop.
Oddly enough, one college major is well known to have limited economic value: music. Yet music education follows the apprenticeship model. Individual lessons, practice, ensembles, and performances are central to the education of music performance majors.
Composition and improvisation (where relevant - mostly surviving today in performance on the organ, and almost totally dead in other instruments) are also taught the same way. Somewhat surprising perhaps, since there's actually quite a bit of intellectual engagement with these topics, too - but a heavily practical attitude has long been recognized as yielding the best results.
I'm a jazz musician, but learned improvisation on the bandstand. It was certainly "learning the hard way," and I'm not sure I recommend it.
I think one reason for needing practice is that in either of those areas, you're going to suck for a long time before you get good at it. Maybe this is true of writing too. The academic environment is a place where you can get that kind of practice.
> I think one reason for needing practice is that in either of those areas, you're going to suck for a long time before you get good at it. Maybe this is true of writing too.
Not sure even that entirely holds. A philosophy mentor who can follow your interests and provide/recommend short courses may be as beneficial as a college degree... except that employers focus quite a lot on those fancy certificates.
The company I work for is about to take 2 software engineering apprentices for the first time. As far as I can tell, we're not equipped for teaching any theory and as CompSci for under 18s is fairly uncommon in the UK, it's unlikely they're going to have a solid grounding beforehand. Does anyone have experience on what kind of environment and investment of people is required to make a good apprenticeship for this field? Everyone at the company did a CompSci or similar degree, but I feel that apprenticeships are a fundamentally different model.
Suggestion: could the company/apprentices blog about the experience? (Without necessarily getting too personal.) It would be a great way to share with the community, and good fodder for a discussion and the following round of apprenticeship experiments :-)
We're not big bloggers if I'm honest. Everything we do is closed source, corporate and targeted at enterprise. Maybe it'd be nice to do, but it's just not the culture we have in the office.
The ideal situation is to pair each apprentice with an experienced dev, with a fixed time allocated every day to help them.
Yes, it's costly. Yes, the experienced dev needs to be ok with it.
Then you find small tasks you know people can google their way around, and that you know won't lead to a disaster if they are done badly or if something breaks.
Then you increment.
Apprentices are very slow and very demanding at the beginning. Once they are productive, though, they have the best cost/benefit ratio in your company, provided you did the first steps well.
Source: been an apprentice for 5 years, and all my classmates were.
The missing piece seems to be some way to ensure the apprentice sticks around once they become valuable. If they are immediately poached with a high salary you can't afford to pay, you get nothing for your troubles of teaching them.
Apprentices can be profitable. I was one and later had access to invoices we sent.
Ok that company did some ... creative ... billing.
But even without that I believe it's not too hard to break at least even.
Doing low-priority work, stuff at the bottom of the backlog, is in my experience possible fairly quickly. It's billable and can offset the senior time used, since code reviews require more feedback and usually some general guidance.
Apprenticeships here have a fixed, government-set duration if you want your degree, and it's difficult to change companies. The durations are set in a way that makes it profitable to take apprentices. During the last year they're usually as productive as any junior dev, at a pocket-change salary.
The obvious solution to this would be to charge them for the apprenticeship, financing it yourself with a loan from master to apprentice (which then immediately goes back to the master as "tuition"). If they are immediately poached at a high salary, they'll have to pay back the loan.
The traditional solution, enslavement, has a bad reputation today.
I think the sticking point is finding small tasks that are meaningful. You can generate that sort of thing for someone doing a 2 week placement, but I think it'll be tricky for much more than that given the nature of the work we have available.
Did you have any structure to your learning to ensure you covered basic topics, e.g. OO, data structures, algorithms, etc?
No, my internship was more of a learn-or-die type of thing. But the dev in charge of me, although he sucked at teaching, was very tolerant of my mistakes, even grave ones. He gave me important tasks, and would shield me from the heat from management when I messed up. So I learned by copying his talented but cryptic work, using trial and error.
I kept at it because, although he kept being very rude (he was that way with everybody), he made sure I was part of the team from day one. When we had lunch and I had no money, he always paid for me so I could eat with everybody at the restaurant. When we had extra pay for being on call during nights and weekends, he included me in the rotation, despite knowing perfectly well that I would just wake him up to solve the problem as soon as the phone rang.
We are still friends to this day, and I came back to work with him several times after my internship.
It depends on how the company behaves. It's no different than with any other employees.
Give people the feeling they are part of something, that their day to day actions have a meaning, and enough money for their trouble, and they'll stay.
Say "we are a family" in your speeches and write "values" on the walls while you are just fitting cogs in the machine while optimizing for numbers and they will leave, or do the bare minimum they can if they need the cash.
This is not something you can fake. Not for long at least.
So it's not specific to apprentices.
However, apprentices will more likely fit in your culture, accept changes, etc. Because they grow with you.
I'd say there are 2 main groups of people: those who have been there 20+ years and aren't likely to leave, and those under 30 who started as grads or interns and have been there a few years. I don't consider myself particularly loyal; I just happen to like the work and the team I'm on, and if that stops being true I'll look elsewhere.
I have a friend who is in this program now to become an electrician. She has been working, for pay, on active construction sites for over a year, after completing introductory training/classroom work.
Accepting an apprentice means taking a preliminary dip in your own productivity, until such time as the apprentice can contribute a net gain to your unit. Someone has to take the brunt of that; with electricians, for example, the introductory training helps them not kill themselves.
Many subjects combine an academic component with an apprenticeship component. Physicians spend a lot of time in clinics as med students before they become full-fledged doctors. Teachers have to student-teach prior to becoming full-fledged teachers.
Cosmetology, hair styling, auto mechanics, the list goes on. All of these professions combine some academic study of principles with intense, prolonged hands-on training.
In the US, if you look at the regulations for unpaid interns, the employing organization must provide experiences equivalent to the amount of learning the intern would receive in a college course. There's some practice of treating unpaid interns as unskilled labor, but that, as far as I know, is against the intent of the "unpaid intern" concept and is, technically, illegal.
The author is mixing up two very different topics: (kind of fundamental) education in classrooms (which existed 500 years ago too) and practical craftsmanship, which now exists in a broad range of professions, including software engineering. So the text reads like the setup for a strawman.
That is an interesting point (framing as a strawman).
Something was bugging me in the article, though I think I agree with many of the points made, and your comment recalled it to mind.
Both fundamental and practical education matter in most realms.
Practical example: medicine (modern allopathic "western" medicine, i.e. the job of a physician, something I know and can comment about relative to both basic science and software).
The author implies that time is "wasted" in basic science, when students could simply be more involved with patients.
Leaving the value of patient contact aside (most schools do involve students early these days, to provide context and motivation), one of the whole points is to provide a multi-level framework for better incorporating later knowledge. This allows a person to adjudicate new findings and therapies, guide their patients and practice, and figure out novel situations and atypical presentations (often due to overlapping processes or red herrings).
Anyhow, I'm sure others can think of similar examples in fields they know.
Also, perhaps one of the western European folks here can comment, but I seem to recall that trade apprentices in, say, Germany, generally still have a pretty decent core knowledge of math, language, etc from school. I.e. pretty comparable to an "academic track" US high school student, excepting AP courses. True?
> Also, perhaps one of the western European folks here can comment, but I seem to recall that trade apprentices in, say, Germany, generally still have a pretty decent core knowledge of math, language, etc from school. I.e. pretty comparable to an "academic track" US high school student, excepting AP courses. True?
I'm not sure about Germany, but in Switzerland, "school" is mandatory until the end of middle school. After that it can really be anything between zero classes, a 100% on-the-job apprenticeship, and the general, academic-track high school. Selection is mostly done by grades, since they're a good indicator of whether school/classes/lectures work for a student or not. Most people go to trade schools and will indeed attend core math, language, and geography/history classes. But the amount and depth of these classes aren't close to general high school; they cover maybe 50-60% of the material. And then you'll have people in dual school/on-the-job training, where core classes are really reduced to the bare minimum.
Note that all tracks offer bridges to tertiary education, at vocational universities or even sometimes proper universities, so that you aren't stuck forever if you didn't take education seriously as a teenager.
I think the diagram of the pyramid vs. the puzzle was a good one and reflects how I learn. I personally don't get concepts until I have touched them; I was never really good at taking a concept from a book and visualizing it.
This is why I excelled at chemistry, computers and real-world physics but never really excelled at basic math (to this day I convert fractions to decimals because my mind does not think in fractions). Anyway, what I came here to say is: once I get the basic concept from hands-on labs, it is very easy for me to get the math, because I can visualize what the math is doing. I seem to only learn from real-world application and by doing. So I think there is value in learning the how, and then layering the theory on after the fact. It seems to stick for me. Then again, we all learn differently.
I personally learned Geometry and Trig from carpentry and 3d modeling.
I learned algebra and statistics from building and modifying internal combustion engines.
I learned business math from building accounting systems.
All of these I learned after hands-on apprenticeship, and that made it very easy for me to grasp and retain the mathematical concepts.
I have joked before about starting the "Boys School for the Mechanically Inclined" to teach boys that learn like I do. I was fortunate and driven enough to understand my learning disabilities, but I see bright boys flounder and can see it's due to the way they are being taught. It really is a shame.
In a way modern coding bootcamps that offer internship placements appear similar to me to the German-style vocational schooling (apprenticeships + classes).
I have a CS degree, but I find bootcamp graduates to have more practical knowledge for many positions right upon graduation. At least this is true speaking from my personal experience and what I have observed.
Most jobs in the US don't truly require a university degree (the knowledge isn't actually required) and I'd argue that most "coding jobs" (deliberate choice of words) don't either.
Yes, but the problem with the bootcamps I have dealt with in the UK is that they operate on a "bums on seats" philosophy.
They charge crazy money for a 3 month bootcamp, then say people are qualified to work as a software engineer. They then turn up at a company and are lacking in various areas.
Graduates are not very useful at the start either; however, they have a foundation, and a good percentage have done an internship, so there's more to work with.
There need to be standards implemented, and companies need to get together and say "we expect x, y and z from bootcamp graduates."
Otherwise, I could go and set up a bootcamp tomorrow, take people's cash, and just make them watch YouTube videos about react.js.
I'm doing an apprenticeship at a FAANG company; it's been an amazing experience so far. I feel like I've learned a hell of a lot more about software engineering from a few code reviews than sitting in lectures and reading textbooks (which don't work as well for me).
The value in university seems to be the brand, the networking opportunities and the environment, not the education.
This is a common thing to say because it's "cool" now but I don't think it's even a little bit true at all. The value of my University was almost entirely the education, I learned a massive amount and did crap tons of homework and projects (not just lectures and books). I was completely and entirely prepared for a good job in software development. I never even wrote "Hello World" before college.
I have a family member who signed up for an apprenticeship, but the waiting list was many years long. He was told that he could get higher on the list (or maybe skip the list entirely, I can't recall exactly) by getting an associate's degree. So he went to a community college to get the associate's degree. Once he got it, however, he no longer needed the apprenticeship and went right into the workforce. I never asked him directly how much he makes, but I suspect it's more than I do, because he's in a specialist field.
The value of my university was the interdisciplinary parts, forcing me to get exposed to things I would have ignored.
I think exposing computer science aficionados to ethics, biology, sound production and many other electives increases mindshare and cross-pollination.
Most of this is irrelevant to the private sector at a static point in time. So the sheet of paper saying I went there, and the brand, are all icing on the cake.
No, it isn't a necessary credential to contribute to companies needing to build things with code. That was not always clear and may not be clear in the future.
I'm curious what CS/CE undergrad program consists only of sitting in lectures and reading textbooks. That sounds like it's missing 90% of the point of classes! I would think university had little value, too, if we didn't have to write any real software.
To the company, there's also the value of selection. Picking the right college grad to hire is very hard. Even harder, I'd imagine, if they hadn't even gone beyond a high school education yet.
Absolutely. My company's been experimenting with different ways to measure apprenticeship candidates than regular ones. I had a group task, a short project/presentation, and an easier version of a whiteboard interview. Trying to measure ability to learn was the main goal.
There are a bunch of comments conflating the lack of apprenticeships with the general recommendation of college as a basic requirement for young people, which doesn't make sense. You can do both; this is how doctors learn today: a 4-year college degree (and medical school) followed by a residency of some sort. The same model could be replicated for other fields if they so demanded.
An equally plausible theory is that the American economy has moved away from primary, low-tech factory jobs to high-tech ones that require a more educated workforce, and the Education sector has responded to that demand. The other somewhat unrelated factor has been the explosion of student loan debt which exploited that demand for education for its own ends.
I would also speculate that having the experience of college and mixing and meeting with people from all over the country is a benefit that is often overlooked today; Americans that don't experience that lose out on an opportunity to meet and know people that they never would have otherwise.
Imagine what happens if you bring the baby to the bakery and have him or her grow up there. Everything would become utterly obvious. Let's also imagine how answering questions improves your skill further. The baby will have you rethink everything, your very essence, and eventually rebel against your methods, your very being :)
Despite the title, the author isn't really saying there should be less classroom learning, only that whatever material is being learned should have a purpose-driven real world application tied into it.
Sure, depending on the specifics that could mean less classroom time as well, but that isn't a necessary result of the idea, just a possible one.