
I’m a fan of community college, but I’m not surprised that no companies reached out. It is rough to try and jump into an entry-level position after only two years of studying programming. If you want to be a programmer, the way community college fits into that plan is as the first half of a four-year program. You take intro programming classes in community college, transfer to a four-year college, and then take your 300-level and 400-level algorithms / compilers / operating systems / databases classes. You end up with significantly less student debt and the same credentials, although this is not without its drawbacks (mainly, you may transfer but end up needing three years of study instead of two; you have to really be on top of shit w.r.t. bureaucracy to de-risk this one).


I’m pretty sure I could do 95% of my job without the advanced classes I took. The important things that I learned since I started my career were not taught at university.


There's always someone who will reply with something along the lines of "I don't use anything I learned in school on the job" or "Software engineering is not about coding, it's about X"

With due respect, in my experience as someone who did not major in CS, the first few years were rough because I had weak knowledge of data structures, almost no knowledge of computer networking (which showed itself when managing containers, etc.), and didn't know even the very basics of bash/Linux.

Nowadays, when I work with CS majors vs non majors, it has noticeably been more difficult to get the latter group up to speed and productive. Of course there have been notable exceptions (there always are when it comes to people matters)


100%. It likely depends on what you're doing, but I work in FAANG-y tech, and if anything CS majors lag behind people with real work experience, any real work experience. DS&A is not relevant to writing a CRUD app or service. It's not relevant to collaborating or getting things done.


If 100% of a job is easy enough for a fresh community college grad, then it is easy enough to be outsourced for 1/10th the cost.

The 5% of the job that is hard, is what you're paid for. It is what keeps the job onshore, and it is what keeps fresh community college grads from being hired into it.


Just because he said he didn’t need a big part of the university courses doesn’t mean that it’s easy. He said that he learned most of the things at the job, which I mostly agree with. For some domains it’s important to have the proper educational background though. But mostly it’s a different kind of difficult compared to how advanced courses are difficult.


Idk I have a lot more confidence because I know what I know, that is, fewer known unknowns, and I have high confidence that there aren't unknown unknowns lurking around corners. This helps me drive decision making across the org.


Ya, but if you're really honest with yourself, did you learn what you needed to "drive decision making across the org" in uni? You may have, I guess, but it seems like you really only get that confidence or those skills after you start doing whatever it is that you do. It's hard to imagine how anything I learned or could have learned in uni would have helped with even the least complex tasks once I got started professionally, that I hadn't already learned in technical college or on my own. That said, I wasn't a FAANG go-getter trying for a spot that required complexity analysis or whatever, or a game developer with a necessary background in math and memory. I feel like the valuable stuff you learn in uni comes up in only some special cases, and even then there are probably better environments for actually learning it than just figuring out how to test well on the material.

I personally found the humanities and hard sciences a bit more compelling compared to any CS courses; viewing the CS testing and teaching methodology through the lens of having already been in industry, it pained me more than it might have otherwise had I been a naive impressionable student with no professional perspective. Sitting there writing out an ADT in Java with a pencil in a not large enough physical space, thinking "why did I decide to do this crap again?", but also not being able to respond to the artificial pressure of the classroom with a fabricated stress response.


I think I agree with a lot of what you said - non-CS classes can be fundamental to learning some job skills (like writing, communicating, and deep thought). However when I say that it helps me drive decision making, I mean that I am literally more confident in what I'm saying because I'm reasonably sure of the boundaries in my knowledge.

For example, I went to grad school and studied databases. On that topic, I probably know more than anyone I work with; so if I put forward a proposal that is related to DBs in any way, I can be quite sure that there are no surprising edges that will come back to bite me (i.e. no one will come along and say "well what about this?", at least from a technical perspective). That's because I know what I know, and there are things I don't know, but I also feel like I know what those are (e.g. I don't feel as comfortable with low-level programming, mostly because I actually skipped a machine organization class and never picked those skills up separately).


Same.

I feel like, outside of the work experience that was baked into my degree, I could've spent those 4 years and all that money in a much more productive way.

(Note: I don't regret it, I had a lot of fun, but in terms of becoming a professional I could've been much more efficient)

As a result, when anyone comes to me asking how they can "break into" programming, I strongly discourage them from going to university at all.


I got a non-CS bachelor's, and then managed to get into development, partly because I had been programming all my life. But I went back for a master's and shored up the undergrad CS that I missed. Boy was that helpful! It wasn't necessary for 95% of my job, either, but it enables me to solve the other 5% that I would otherwise be unable to solve, or would solve with something that barely works. I've found compilers useful (for parsing; didn't have time to take finite state automata), and the OS class was excellent for understanding and being able to write threaded programs that work. Data structures is simply essential (but I took that in undergrad).


I am a much better "engineer" for having taken the advanced classes, but more to the point, I was a much better programmer for having taken the advanced classes.

The total amount of code I had written in my first two years of university was not very much; by the time I had taken some of the advanced courses I had written a lot more code and had thus become a better programmer (still not a good one, but better!). I had written probably 10x more code by the end of year 4 than the end of year 2 thanks to practicum courses, etc.


Not sure about your school, but part of college is a washout tool. Can someone handle BS for 4 years without giving up? That's a good indicator of their character. Another part is that most of the classes should eventually teach you how to teach yourself: you're given progressively harder challenges, but you have to fill the gaps with your own study and research.


Bingo. A BS is a great filter for hard working and higher IQ.

Guess who does really, really well in sales? People who play varsity or college sports. Is it a rule? Nope. Is it a great heuristic? Yep. Does football teach you sales? Not really... but it does filter for certain traits, one of which is probably pure animal magnetism and the base desire of humans to be influenced by the most fit individual.


Team sports teach you how to work productively with people, including people you might not like. That skill directly transfers to sales.


And it almost certainly also filters for competitiveness, probably more so than the typical technical curriculum does.


I like to phrase it this way—in life, there’s a lot of bullshit. In college, there’s a lot of bullshit. If you can handle four years of bullshit in college, that’s a good sign.


The "washout" would work equally well if they were teaching non-BS.

And what do we teach people for 12 years before college? Is college the first time people learn how to learn?


The 12 years before are mandatory, so they're no indication; they used to be, back when they weren't mandatory. That's why high school diplomas don't matter as much as they did in, say, 1920.


Public school is just daycare, not a school at all, due to lowering the bar and easing the curriculum in the name of "equity".


We could probably do a better job in the years leading up to the 16-18 year old start of going away to college. But we'd probably need to spend the money to customize things a lot more. One of the fun reads I had over the past year or so was from the NY Times on basically a one-room schoolhouse in Alta Utah where the kids basically spent half the day skiing. I can think of worse things.

https://www.nytimes.com/2024/04/21/us/alta-utah-public-schoo...


With AI, what we're going to see is the unbundling of school: education separately, socialization separately.

You can socialize with your ski buddies. You can study with a self-selected, motivated group of teens together with AI and a tutor (a college grad with AI teaching high schoolers, high schoolers with AI teaching middle schoolers, etc.).

This model will be decentralized, more like a Twitch group than a centrally administered school district.


That’s the problem. You can’t just eliminate pre-college and hope everything works out fine.

Because AI.


College won't be needed because of AI; the entirety of the liberal arts is already obsolete.

Engineering disciplines can be split into blue collar (like operating a CNC machine), learned at vocational schools, and deep tech (research focused), which really only needs "elite" colleges, the top 1%. The other 99% of colleges might as well disappear.


The level of disconnect from reality you often see in tech circles is pretty amazing.


This is my vision of the future, given the current rapid pace of AI development.

Just ask yourself: are you ready to pay $300k ($65k tuition + $20k living costs per year) for each of your kids to go to college, party, get drunk, and study data structures & algorithms and Java?

Do you want to spend $300k so your kid understands the Cormen/Leiserson algorithms book, and do you think that's good value?

And I am talking about CS, arguably one of the highest-ROI degrees out there.

Meanwhile, a kid from India or Eastern Europe who got the same CS degree for $10k out of pocket gets the same knowledge and can land the same FAANG job.


I'm not sure what AI has to do with that question to be honest. The discussion upthread was about secondary education which is required at some level and generally free (to the student's family) in the US today. There are tons of resources for self-study and other activities for motivated students/families.

As for college/trade schools, options range from essentially free for self-study to very expensive with everything in between. Again, I'm not sure to what degree AI changes most of that. I'm certainly prepared to believe that credentialism becomes less important in some fields (it sort of has in CS) but that has less to do with AI than some companies relying less on universities to do their vetting for them.

Other fields, of course, differ. You've got a lot better shot at a top-paying job in Big Law if you went to Harvard Law and clerked for a Supreme Court justice.


My argument is that secondary education is not about education, since standards have dropped with the introduction of Common Core and other improvements in the name of "education equity".

If you want to send your kid to daycare, please go ahead and send them to public school. But more and more parents opt for alternatives: private schools, college prep programs, charter schools, extracurricular math and science classes, coaches/tutors, etc.

In the end it doesn't make sense to maintain a bloated school system that accomplishes nothing but glorified childcare for teens, while all academic achievement is due to self-selection.

You can look at any top-rated high school and realize they aren't doing much. They just have a self-selection mechanism that filters for the most motivated kids with strong parental support.


a kid from India/E̶a̶s̶t̶e̶r̶n̶ Europe got same CS degree for $10k

Not even that much; just cost of living. So why not adopt the same model everywhere?


No degree here, 20 years in. Worked in embedded, medical robotics, AAA games...


No degree here, either. Simply not a problem for most work involving software and code.


100%. The tough leetcode-style interviews mostly cover CS101 and CS102 topics that most of us stopped using the minute after graduation.


University can offer two things:

1. Equip students to do a job.

2. Equip students to find a job.

These are surprisingly disparate, and I suspect the latter is a lot more significant than most people realize.

Put differently: I suspect the real challenge is getting hired without explicit qualifications.


I don't agree at all. You're describing a trade school. A university does not care if you get a job. They only care that they've satisfied your academic interests enough to stay in the profession or continue your education. They offer extra services to help you find a job but it's completely voluntary so I don't equate extra motivation with "equipping".


> A university does not care if you get a job.

Many report out this exact statistic. My alma mater claims 90% job placement for STEM grads.

> They offer extra services to help you find a job but it's completely voluntary so I don't equate extra motivation with "equipping".

They go out of their way to offer these services, but of course it all boils down to personal motivation. They hand students a tool and show them how it's used. That's my definition of "equipped."


And, at the risk of being totally cynical, they report out this statistic because they care whether you are successful in a noteworthy way and make big contributions over time. (And may even care a bit whether you are successful in an abstract sense.)


This.

It’s weird that people equate Computer Science to Software Engineering. TOTALLY different!

In fact, you could argue that you could do a compsci bachelor's and come out at the end knowing everything theoretically but not knowing how to actually code well. Compare this to a software engineering undergrad, where the whole time you should be coding!


I'm not sure about your #2. Self-promotion is not necessary at university, since the profs receive mandatory submissions. Very different from a job search, putting yourself and your paperwork "out there" in a targeted way.


Replace "to find" with "to be accepted". There are still a lot of (stupid) company rules about hiring only people with an advanced degree.


I agree about classwork, but I think many universities put a lot of effort into placing their students. Job fairs, resume classes, internships, student research, etc.

Personally, I suck at self promotion. The career aids offered by my school made a world of difference.


Outside of internships, everything else is opt in.


Of course. Wait, you had required internships?


Some schools do have co-op programs of various sorts. But they're generally opt-in with various degrees of optionality.


Actually a university does neither. A university's job is to expose you to higher order thinking, and different ways of thinking. The rest is up to you.


Obviously it comes down to the motivation of the student, but that does not mean university does not impact your employability, particularly in your early 20s. Documented exposure to higher order thinking is an employable skill.


Agreed, but the OP's claim wasn't whether a university degree is needed for employment, the claim was whether a university trains you for a job. The answer to the latter is a no. Community colleges and trade schools train you for a job. Universities give you an education in a subject area, the rest is up to you.


My claim was that university "equips you to find a job" not "trains you for a job." I view that as an important distinction!


Or, maybe, you learned way more than you think, and that let you concentrate on the "important things".


That's not true at all.

Our industry is swimming in auto-didacts, dropouts, bootcampers, and community college graduates. There's no licensure, and most industry-adjacent academic programs spend a lot of time teaching concepts and abstractions far divorced from everyday industry needs. Committed tinkerers who are ready to get their hands dirty and start learning the trade have historically been very welcome.

However, the job market is upside down right now, because we had an insane hiring bubble that has since burst, leaving countless people with real work experience in the candidate pool for jobs at all levels.

In this kind of market, which we're used to seeing come and go, there's temporarily a lower-than-usual demand for candidates with less vetting, and so the auto-didacts/etc can expect to have a hard time getting attention for a bit.

A lack of interest in community college graduates is an ephemeral market state that we're familiar with seeing sometimes, and definitely not a structural detail of the industry.


I think that's probably fair. I know very qualified people who are having trouble finding jobs. I don't personally care all that much but a number of opportunities post semi-retirement haven't really materialized. Which is fine but would probably have been different in different times.


When Seattle built their light rail system they ran it past UW and nearly every community college in the region.

It was already a known phenomenon for UW students to take CC classes to reduce the cost of gen-ed requirements, but after the trains came, that just got easier. Much faster than the buses.


I worked in the CC system and I am a huge advocate for what CCs bring to market: a low-cost alternative to primary institutions, cutting the high cost of a bachelor's degree by at least a third. But those CCs were primarily focused on getting students to take their generals (call it what it is: an Undecided major). While this is great and all, most companies expect even their interns to have a 'hit the ground running' ability, right or wrong.

The only way for ANY college student to do this is through real-world experience (from an organization willing to take on someone very green, or through private repo work/portfolio). I am regularly asked by randoms on LinkedIn and/or my network, as well as interns and/or candidates, what they can be doing to improve their chances at an internship or FT role. In those instances, I always recommend that these folks:

1. Come up with a project, end to end.

2. Develop the code and post it in a repo that can be accessed by companies during the hiring process.

3. Be able to explain the why behind the project, the entire process end to end, and--most importantly--why the work was important to do.

These are the items I believe will put those green students in the top 20% of their competition in the market. I find entirely too many candidates who have a lot of coding and theory courses and not a whole lot of application of those learnings. Being able to show management of a project end to end, explain the development process coherently, and articulate the potential business value it brings is a big way for those folks to stand out from someone who can just write code.

Many average four-year colleges aren't doing this for their students; CCs most definitely aren't.


Expecting college students to hit the ground running with whatever stack a company is running is bad and unreasonable, and we should push back against it. Engineering, law, accounting, and medicine don't do this. Why should computer science? Don't reduce a university education to learning JavaScript frameworks.


An oversaturated job market allows companies to demand this.


I guess. I was never a programmer, and I guess my last job was pretty much hit the ground running; at least I was told I did. But I'm not sure how much that's the norm.


I went to the University of Illinois. That’s a solid tech school pretty much across the board, often cracking the top five in a number of subjects, including CS.

But the real secret of that school was that, at least when I was there, they had too much computer hardware for the number of students.

The number of hours in a day you could show up in a lab and not have a waitlist was fairly generous. Which meant a lot of the smarter students had no problem working on their own pet projects.

From a “hit the ground running” standpoint, a lot of these kids were only bothering with a 3.3-3.9 GPA because time spent getting three more points on a test or homework was time lost from working on your project, or talking to people about theirs.


> I worked in the CC system; but, they were primarily focused on getting students to take their generals (call it what it is: an Undecided major). While this is great and all, most companies are expecting even their interns to have a 'hit the ground running' ability; right or wrong.

I haven't worked in the CC system, but I've been around the Bellevue/Seattle-area CCs for many years.

And at least for those, there are broadly 2 tracks. The first is what you say - the first 2 years of a bachelor's degree track.

The 2nd track is a variety of certifications. Sometimes these are industry certs from Microsoft, Cisco, etc., and some of them are CC certs. But they aim to get people productive in 2 years.


Okay, but it's plenty enough for a paid internship. Many of these students are going to transfer to a 4-year school after this, so I don't know why this would be different.


I think it's just rough no matter what. I've worked with a lot of junior devs, and the thing is, I haven't noticed that the 24-year-olds with a fresh CS master's do particularly better than code school grads who were working at Walgreens or whatever the previous year.

Software development is a discipline in its own right that only very partially overlaps with a university CS curriculum.



