The worst class I had in college was Software Engineering. It was the university's attempt to prepare us for the workforce, and it was taught by an adjunct who had plenty of industry experience, but it was already 10 years out of date.
Industry processes are mostly fads that change with the wind. CS fundamentals, however, are much more stable. 20 years from now, knowledge of automata, graph theory, and complexity analysis will still have immense value--a scrum master certification won't.
I also had a Software Engineering class (actually two of them) focused on how to build software in real life. This was in '03 and we covered things like waterfall methodology, requirements gathering, functional specs, etc. If taught the exact same way today it would be woefully out of date, but the time we spent on requirements gathering (where the teacher or TA pretended to be a product owner and purposely gave really crappy answers, and we had to extract useful information bit by bit) was one of the best pieces of prep I ever received.
All in all, it was boring and tedious, but it certainly wasn't the worst class I ever had in regards to preparing me for a career in technology. I use those lessons to some degree all the time; I rarely directly use all the work I had to do to create my own OS...
> I use those lessons to some degree all the time; I rarely directly use all the work I had to do to create my own OS...
I think you probably would have learned those lessons after a few months on the job. You probably wouldn't have picked up the knowledge to build your own OS on the job, however.
>All in all, it was boring and tedious but it certainly wasn't the worst class I ever had in regards to preparing me for a career in technology.
Did you learn about billing clients, and effectively advertising your services in your CS degree? What about equity versus salary tradeoffs?
At least 50% of working in technology is soft skills, so why doesn't a CS degree spend 50% of the time teaching you those? The answer is that a CS degree isn't supposed to be vocational training.
Vocational training, including learning to talk to clients, should be done on the job, during an internship/apprenticeship, or in a specialized vocational training program. By including it in a college degree, employers have successfully pushed employee training costs onto workers and society (as the article argues).
The article isn't complaining about CS majors not getting jobs, it's about soft liberal arts majors not getting jobs.
When companies hired people for 30 year careers, they could afford to invest a tremendous amount in training. When they hire people for 2-3 years, they have less time to amortize the costs. And it's up to the employee to convince the companies that they can learn quickly, and on their own if need be.
Any CS major with decent grades and a positive attitude can learn anything in most any job. (Certainly CS, consulting, finance, marketing, even some kinds of sales) I can't say the same for liberal arts majors. There are great ones out there, but also a lot of folks who goofed off for 4 years and didn't learn anything.
You could take that one step further. CS, in its modern incarnation, has only been around for ~90 years. CS, as it is taught at the undergraduate level, has changed dramatically in recent years, and will continue to change into the future.
Math, on the other hand, has been around for thousands of years, is relatively stable, and unlikely to become obsolete in the way a scrum certification, or even a machine learning algorithm, will.
Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.
>Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.
I agree with you completely. The parts of CS that are stable are the parts that are based on rigorous mathematical foundations. I think that teaching things like Object Oriented Programming strays too far from a rigorous foundation--away from math and even engineering into craft (which belongs in vocational training).
When I look back, the classes that I learned the most from were, Discrete Math, Automata, Design and Analysis of Algorithms, and Programming Language Concepts (which went into the academic side of programming language research more than what was currently in use in industry).
I mostly agree, but for me, the line gets blurry around the applied areas that have a lot of depth: computer architecture, operating systems, networking protocols, compilers, and databases. In all of those, I learned a lot about theory, practice, and engineering trade-offs, all of which was worthwhile. I didn't study it myself, but I would imagine distributed systems is (or should be) a similarly rich subject. I also learned a ton from studying the history of computing, which I wish would be more of a focus for those entering the industry.
Good point. Many of the areas you mentioned do have a lot of formal underpinnings, and there are large bodies of research to look to for guidance.
Computer architecture is big E Engineering, done by Computer Engineers for example. The networks class I took was also one of the most math heavy, and most of the book was supported by proofs. In addition, we spent the first half of databases working with only relational algebra.
If you look through a textbook on any of the subjects you mentioned, and compare it to say a book on design patterns, the distinction between math/engineering and craft is pretty clear.
>I also learned a ton from studying the history of computing, which I wish would be more of a focus for those entering the industry.
We went over the history in depth in my program--from Turing to Konrad Zuse to Backus. I also found it immensely useful.
My software engineering class spent most of the semester going over design patterns, which are in fact quite useful to learn in school, and then maybe 1/4 of the class going over the various development methodologies. I agree that a class devoted entirely to methodology would be complete overkill. However, I think there is room for getting some exposure to it in school, ideally before taking higher level classes, where having knowledge of existing ways to structure your group work will be beneficial.
The problem is that design patterns are subjective; they are craft, not science or engineering.
There hasn't been enough serious research done on "Software Engineering" to call it Engineering with a straight face. You can't point to a whole stack of serious research to say that design pattern A is objectively better in situation X because of Y and Z.
What you can say is that design pattern A is currently in vogue so you should probably use it, while design pattern B has fallen out of favor in industry, so you should avoid it.
That is something that belongs in a vocational training program or an internship/apprenticeship not in a university Computer Science education.
You're going too far when you dismiss design patterns as being merely fashion. Just because something is not objectively proven doesn't make it false. Things in the real world are not binary, where they are either objectively proven (hard sciences) or completely false ("The earth is flat"). In reality, a lot of things are gray. Design patterns fall in that bucket--many of them help, as long as you remember that there are exceptions.
If many people learnt over and over again that global state, for example, leads to more bugs, you'd be wrong to completely dismiss it just because it isn't objectively proven. Because then you'd be arguing that a program that uses only global variables is just as good as one that's properly encapsulated and abstracted. Do you think anyone would take you seriously if you said that?
I don't have a problem with design patterns as a concept. But you need to recognize them for what they are--folk wisdom. Some of it is useful, much of it isn't.
If there is no theory we have to fall back to empirical analysis, and unfortunately our industry hasn't done much of that. The only thing we have to go on is the general "consensus" of the industry, which is cyclical, transient and mostly fashion.
Some of the industry folk wisdom is beneficial and withstands the test of time. Most of us agree that encapsulation is nice. However, we don't agree on what form that encapsulation should take.
OO programmers argue that state should be hidden away inside objects; functional programmers believe that state should be explicit, and that we should always aim for pure functions and immutable data when possible. There's very little objective data to support either side (except that functional programming languages tend to have more formal underpinnings). Mostly it falls back to personal preference, which programmer sages you trust, and what the current industry fashion is.
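To make the contrast concrete, here's a toy sketch (hypothetical `Account` types, not from any particular codebase) of the same operation in both styles:

```python
from dataclasses import dataclass, replace

# OO style: state is hidden inside the object and mutated in place.
class MutableAccount:
    def __init__(self, balance):
        self._balance = balance  # encapsulated, mutable state

    def deposit(self, amount):
        self._balance += amount

    @property
    def balance(self):
        return self._balance

# Functional style: state is explicit and immutable; an operation
# returns a new value instead of mutating the old one.
@dataclass(frozen=True)
class Account:
    balance: int

def deposit(account, amount):
    return replace(account, balance=account.balance + amount)

mutable = MutableAccount(100)
mutable.deposit(50)

immutable = deposit(Account(100), 50)

print(mutable.balance, immutable.balance)  # both arrive at 150
```

Neither version is objectively "better"; the tradeoffs only show up at scale, which is exactly the kind of empirical question our industry rarely measures.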
Our industry reinvents the wheel time and again because we are slaves to the cyclical nature of the industry fashion.
Relational algebra/calculus has been formalized for decades, yet people who didn't understand relational theory cried out for something "simpler", and thus NoSQL was born. Fast forward 5 years and you'll find that many of the people championing NoSQL had to reinvent solutions to tough problems that CS had solved decades ago.
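For what it's worth, the core relational operators are tiny. Here's a toy sketch over lists of dicts (illustrative names, assumes non-empty relations)--essentially the hand-rolled join that document stores push back into application code:

```python
# A toy sketch of relational algebra over lists of dicts -- the kind
# of join that document stores force application code to reimplement.
# Assumes non-empty relations; a sketch, not production code.
def select(relation, predicate):          # sigma: filter rows
    return [row for row in relation if predicate(row)]

def project(relation, attrs):             # pi: keep only some columns
    return [{a: row[a] for a in attrs} for row in relation]

def natural_join(r, s):                   # join on shared attributes
    shared = set(r[0]) & set(s[0])
    return [{**x, **y} for x in r for y in s
            if all(x[a] == y[a] for a in shared)]

users  = [{"id": 1, "name": "ada"}, {"id": 2, "name": "alan"}]
orders = [{"id": 1, "total": 30}, {"id": 1, "total": 12}]

result = project(natural_join(users, orders), ["name", "total"])
print(result)  # ada's two orders, joined without a query planner
```

Every application that stores users and orders in separate documents ends up writing something like `natural_join` by hand, usually without the decades of optimization research behind a real query planner.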
Again, there is nothing inherently wrong with craft and folk wisdom--just like there's nothing wrong with learning salary negotiation, but they don't belong in an academic CS environment. These things are best taught in an internship/apprenticeship after you've learned the underlying theory.
Design patterns are models, something very much within the wheelhouse of engineering. Engineering doesn't always deal with absolute facts as in science. When an underlying system is too complex to fully describe, simplified modeling can be appropriate.
Models are used to get a better understanding of a complex system, or to test a complex system when you can't test the real thing. A design pattern could be used this way, but they aren't. They are used as a design methodology that you are encouraged to follow.
They also aren't based on any formal theory. They are based only on the experience of the people who create them. They are almost the definition of a craft (as opposed to engineering).
In an actual Engineering discipline you would need evidence to prove that your model fits reality as opposed to just trusting the experience of a few guys who wrote a book.
From my experience, design patterns are both taught and used as a starting place for solving a problem which is recognized as similar to one that was previously solved effectively in the manner of the pattern. Rarely will the design pattern fit perfectly as the solution, but by recognizing and using the correct one as a starting point for many architecture-related problems, you can greatly reduce the work involved in creating the solution.
They keep people from re-inventing nearly identical solutions over and over, and allow for a vocabulary which can express rather complex ideas because fellow practitioners of software engineering generally know many of the same design patterns. This saves time and reduces the opportunity for miscommunication when expressing a more complex idea (assuming the person you are talking to doesn't lie and for instance say they are familiar with the facade pattern, when they are not).
To me this is using them to both get and express a better understanding of a complex system. I am also able to understand code for new projects I am to work on much quicker when I understand the underlying design of the elements. When the elements are designed with a structure that resembles things I am familiar with this process becomes fairly easy.
I see them as things like definitions of 'Suspension Bridge', 'Victorian House', 'Tunnel', or 'Dog House'. Sure, the corresponding type of engineer might see a book consisting of very rudimentary definitions of structures like these as absurd, but that is because of their familiarity with the form these structures take. That familiarity with the form is the point of design patterns, in my opinion. And I believe their utility is indispensable if we want to continue building more complex structures in code.
I have no problem with design patterns as a concept. And I have no problem with design patterns being taught as the collective folk wisdom of wise sages of industry.
However, design patterns are craft, not engineering, and should be taught as vocational training--not as an academic subject.
Design patterns have no rigorous underpinnings. They have very little academic research to back up their efficacy. The only "proof" we have that the design patterns being taught are beneficial is the word of a few guys who wrote a book and the collective folk wisdom of industry.
There is nothing wrong with this, but it isn't a firm foundation for an academic subject.
Again we see another Dijkstra soldier pushing the corrosive "software programming is complex" ideology. Notice how they are the same group as Martin Fowler followers.
> "They keep people from re-inventing nearly identical solutions over and over"
No, they force users to rewrite the patterns over and over again, since a pattern is an idea that a given language is not powerful enough to abstract over. Cf. opinions on the C2 wiki.
Maybe that's why you think what we do is "too complex to fully describe" - you haven't used a language that allows you to abstract at will.
OK, just cool your jets mister, everyone here is having a pretty civil conversation without you attacking people for 'pushing corrosive ideology'.
And in order to formulate that attack, you pieced together quotes from two completely different messages and people to make your 'point'.
I said they keep people from re-inventing solutions; it was the comment you actually replied to that claimed they were too complex.
I agree they are patterns that languages do not yet abstract over, but until languages do, I think they are very useful patterns for programmers to be able to correctly recognize, as they occur frequently enough that knowing the pattern saves a lot of time and work. When languages can successfully abstract over those patterns, knowing them will become niche knowledge, assuming new patterns cease to be detected by humans before languages gain the ability to abstract them.
> OK, just cool your jets mister, everyone here is having a pretty civil conversation without you attacking people for 'pushing corrosive ideology'.
Please.
Rhetoric = Ethos + Logos + Pathos
Unless you want to believe I chose my words while extremely angry at you personally, and unless you want everyone to never use pathos and talk like robots in perfect clauses and rebuttals connected in a directed acyclic graph, I suggest you get used to this very common rhetorical necessity.
If college education is too specific to one company, then you are locking yourself into that company, and you will lose any leverage over salary--or even the ability to leave.
Oh god yes I went through the same pain. Our university decided that every Engineering student had to take two CS courses during the first year. This applied regardless of whether you were studying Civil Engineering, Electrical, Mechanical etc. This was alongside all the other first year courses we were required to take such as Chemistry, Physics, Calculus, Linear Algebra, Statics, Dynamics and all that.
First Semester we took Intro to Programming and Algorithms (CS1100 or something like that). We learnt everything from binary notation and logic gates through to floating point. I went from not knowing any programming language to having a fair idea of how topics like recursion worked; our last project was to write some code to walk a tree using recursion. I also learnt how to use Unix and received my lifelong love of Emacs (which we used in our tutorials) from this course.
Second Semester we had to take Software Engineering (CS 1110 I think). It covered the waterfall model, version control, specifications, unit tests, and more esoteric stuff like loop invariants and formal correctness. Our major project was to write an essay about the Ariane 5 rocket explosion. I really enjoyed the CS 1100 class, but CS 1110 effectively succeeded in boring me to tears. It was somehow supposed to give us a taste of real world software design. All it really did was encourage myself and many others not to take CS electives in later years.
My major was Materials Engineering; I took mostly physics electives in the last year of my degree, which is kind of ironic now because I work pretty heavily with programs--my day job involves writing computer simulations. I learned most of the skills I need on the job (including the C programming language, databases/SQL, etc.). I probably would have benefited from more formal training, but my experience of university CS was so miserable I actively avoided it.
I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.
In college, I didn't learn version control, continuous integration (continuously submitting your work in small changelists or patches), unit testing, making sure you're building the right product before building it, delivering the simplest possible code and design that meets the requirements, code quality, working in teams, untangling dependencies and making as much progress as possible today without waiting for all your dependencies to be resolved, and so on.
I expect that all these skills will be very much relevant 20 years from now. So, don't confuse long-term value with "grounded in CS fundamentals". Programming isn't a hard science like physics.
>I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.
I didn't say that. I said most industry processes are fads. I also didn't say that nothing that isn't grounded in CS fundamentals has long term value.
There are plenty of other skills that have long term value. Office politics, salary negotiation, and self-promotion are much more valuable than knowing how to run a few git commands. But none of those things should be taught in a Computer Science program.
They are fundamentally vocational skills. Just like version control, unit testing, and continuous integration are vocational skills. Sure they're useful but they should be taught in an internship/apprenticeship or on the job.
>In college, I didn't learn version control...
I learned to use Subversion, and other than the fact that they are both version control systems, what I learned didn't really carry over to distributed version control like Git.
In a CS program you should be learning things like how to implement a version control system, not how to use Git. I would have been pissed paying thousands of dollars per semester for a professor to walk me through a Git tutorial.
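To illustrate the difference: the core idea underneath a system like Git fits in about twenty lines, and that's the kind of thing worth covering in a course. Here's a toy content-addressed object store (a sketch of the concept, not Git's actual on-disk format):

```python
import hashlib

# A toy content-addressed store -- the core idea underneath systems
# like Git. Objects are stored and retrieved by the hash of their
# content, so identical content is stored exactly once.
class ObjectStore:
    def __init__(self):
        self._objects = {}

    def put(self, content: bytes) -> str:
        oid = hashlib.sha1(content).hexdigest()
        self._objects[oid] = content
        return oid

    def get(self, oid: str) -> bytes:
        return self._objects[oid]

store = ObjectStore()
oid = store.put(b"hello, version control")
assert store.get(oid) == b"hello, version control"
# Storing the same content again yields the same id (deduplication).
assert store.put(b"hello, version control") == oid
```

A student who builds this understands why Git can't be tricked into storing the same blob twice and why history is tamper-evident--which transfers to the next VCS in a way that memorizing `git rebase` flags never will.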
I don't have a problem if a professor wants you to use github to submit your assignments or something like that. And sure some of the vocational skills you listed are going to be useful for years to come. But these skills should be ancillary. They should be just a happy side effect--like learning teamwork during a group project.
I'm a practician, not a theorist or an academic. I couldn't care less whether something is based in CS theory. I care whether something will be useful to me over the course of my career. If it is, I'd like my education to train me for that.
In fact, some of the academic stuff like compilers and automata have been useless in real life. That's a failing of academia from my point of view.
That's perfectly fine. What you're looking for is vocational training, not a liberal college education. Non-professional college programs are explicitly not vocational training. If they were, they wouldn't require spending nearly half your time on general education requirements (assuming we're talking about the US here). I doubt art history, physics, or psychology has been of much direct use to you in your career.
>In fact, some of the academic stuff like compilers and automata have been useless in real life. That's a failing of academia from my point of view.
Finite state machines and pushdown automata are an incredibly common pattern, and I can't see how you can work as a professional software developer without running into that pattern time and again. Have you never used regular expressions?
Automata (usually taught along with theory of computation) teaches you all kinds of useful real world knowledge, like why you can't parse HTML with regular expressions, and why you can't write a program to tell if another program will eventually halt.
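And the "theory" here is not heavyweight. A DFA is just a transition table. Here's a toy one (my own illustrative example) that accepts binary strings whose value is divisible by 3, by tracking the value mod 3 as it reads each bit:

```python
# A DFA as a transition table: accepts binary strings whose value is
# divisible by 3. The state is the value mod 3 of the bits read so
# far; reading bit b from state r moves to (2*r + b) mod 3.
TRANSITIONS = {
    (0, "0"): 0, (0, "1"): 1,
    (1, "0"): 2, (1, "1"): 0,
    (2, "0"): 1, (2, "1"): 2,
}

def accepts(s):
    state = 0
    for ch in s:
        state = TRANSITIONS[(state, ch)]
    return state == 0  # state 0 means value mod 3 == 0

print(accepts("110"))  # 6 -> True
print(accepts("101"))  # 5 -> False
```

Once you've built one of these, "a regex engine is a finite automaton with finite memory" stops being an abstract slogan, and the limits of regexes become obvious.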
My idea of education is one that teaches you skills that are broadly used throughout your career. I don't a priori reject things that meet this criterion just because they're not based in theory (because theory is not an end in itself), or by applying arbitrary labels like "vocational" (whatever that means), "liberal" or "professional".
As for art history and psychology, that's a different debate to be had about education — whether these should be part of education and how much time they should take.
As for your question, I've used regexes, but you don't need to understand the details of the regex engine in order to use them. Neither do I, in my day-to-day work, write programs that try to tell if other programs halt.
>but you don't need to understand the details of the regex engine in order to use them.
Yes, at some point you do. Without understanding how regular expressions actually work, you can't know when it is appropriate to use them. Many grammars simply aren't parsable with regular expressions. You can either waste time trying to write an impossible regex (or write one that works on your tests but blows up in the wild), or you can study automata theory and understand what actually goes on underneath.
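To make the "works on your tests, blows up in the wild" point concrete, here's a sketch: a regex that handles the balanced-parentheses cases you happened to test, versus the counter (i.e. pushdown-style memory) that the theory says you actually need:

```python
import re

# A plausible-looking regex for "balanced parentheses". It passes the
# shallow test cases below, but a regex engine has only finitely many
# states, so no fixed pattern can count arbitrary nesting depth.
naive = re.compile(r"^\((\(\))*\)$|^\(\)$")

print(bool(naive.match("()")))      # True
print(bool(naive.match("(()())")))  # True
print(bool(naive.match("((()))")))  # False -- balanced, yet rejected

# What the theory says you need: one piece of unbounded memory
# (a counter, i.e. a degenerate pushdown automaton's stack).
def balanced(s):
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:       # closing paren with nothing open
                return False
    return depth == 0           # everything opened was closed

print(balanced("((()))"))  # True
print(balanced("(()"))     # False
```

Automata theory tells you in advance that no regex will ever work here, no matter how clever, which saves you from shipping the `naive` version.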
As for the halting problem, I'll leave you this stack overflow explanation for why it is beneficial to understand.
Many problems in CS have already been solved, some are impossible to solve. You can either waste time on trial and error trying to reinvent the wheel or you can study the theoretical underpinnings.
Do you want to spend a week trying to model a problem as a finite state machine, only to determine that a finite state machine isn't powerful enough to solve your problem?
Do you want to spend a month banging your head against a wall trying to solve a problem that you could have solved in 5 minutes had you realized it was just a well-known graph theory problem all along--a problem that was solved decades ago? The only way to know these things is to study the theory behind what you do.
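As a concrete example of what I mean: "fewest introductions between two people" (or fewest currency conversions, or minimal build steps) is just unweighted shortest path, i.e. breadth-first search. A sketch with a hypothetical acquaintance graph:

```python
from collections import deque

# Many "hard" day-to-day problems -- fewest currency conversions,
# shortest chain of introductions, minimal build steps -- are just
# unweighted shortest path, solved decades ago by BFS.
def shortest_path(graph, start, goal):
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # goal unreachable

# hypothetical "who can introduce me to whom" graph
graph = {"ann": ["bob", "cat"], "bob": ["dan"],
         "cat": ["dan"], "dan": ["eve"]}
print(shortest_path(graph, "ann", "eve"))
```

Recognizing the problem as BFS is the 5-minute version; not recognizing it is the month of head-banging.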
Why do you think Civil Engineers are required to take physics? The difference between an Engineer and an artisan is a rigorous understanding of the formal system underpinning his work. Artisans build through trial and error and experiences, and they leave many failed projects in their wake while they gain this experience. Engineers use theory and modeling to limit the number of failed projects to the net benefit of everyone involved.
> My idea of education is one that teaches you skills that are broadly used throughout your career. ... applying arbitrary labels like "vocational" (whatever that means)
vocational - adj.
2. (of education or training) directed at a particular occupation and its skills.
Which is exactly your idea of education. The labels are descriptive, not arbitrary.
Then, do not go for a degree in Computer Science if that is all you want. I am sure there are cheaper and better ways to become a practician than going through college.
Becoming a software developer is just one of the possible career options after a CS degree.
I doubt that there are cheaper and better ways to become a practician than going through college. In India, where I live, most companies are not interested in you unless you have a degree. The normal path to a career in programming is to get a CS degree. And the normal outcome from a CS degree is a career in tech. So, they are much more closely related than you acknowledge, at least in my part of the world (things may be different in yours).
Yes, it happens also in Spain: companies want college degrees for their software factories, and at the same time complain that colleges do not teach anything useful.
I guess it depends on the purpose of the education. If you're already getting the CS fundamentals, maybe it doesn't hurt to get up to speed on some of the industry fads at the moment, since most of the students could be looking for an industry job in two-three years.
Everything is a tradeoff: if you spend time on industry fads, you're spending less time on CS fundamentals.
Industry fads aren't useful outside of industry, so learning them should be paid for by industry. As the article argues, 20 years ago they were. Now employers are trying to push the cost of vocational training onto workers and the public.
Well, the whole point of the article is that those students looking for industry jobs in two-three years should just be hired and trained in the fads of the moment, instead of wasting higher-ed time teaching trivial and likely irrelevant-by-then knowledge.