I have a feeling that the lack of faith Americans have in their government ends up being a self-fulfilling prophecy. I work closely with a lot of government employees, and it really seems like 95% of the work is about avoiding "boondoggles" -- i.e. highly visible failures. Boondoggles can be used as political tools, and ultimately politicians are more concerned with getting reelected than with getting anything done. The upshot is that they would much rather risk low-visibility failures than high-visibility ones. As an example, about 10 years ago they were looking to install wifi in one of the buildings I work in, so they bought a bunch of routers. One of the higher-ups wanted to ensure this wouldn't cause any security issues, so they ordered audit after audit after audit. Eventually they just stopped trying: the building still doesn't have wifi, and all the money spent on those routers and audits was wasted. The kicker is that there isn't even any sensitive data on that network to secure. It's like this for everything. I think a better system would tolerate some very visible failures instead of basically guaranteeing non-visible ones.
Federal receipts as a percentage of GDP have been basically the same since WWII. (Before that they were significantly less.) One of the reasons for this is that we started spending around the "maximum size of government" level during WWII for obvious reasons, i.e. the point past which raising taxes doesn't increase government revenues because they're offset by shrinking the economy, and never receded from there because there was never anyone who could cause it to stop and had the incentive to. What changes is really only who gets the money.
So all the spending is effectively zero sum. The real tax rate never really goes up or down very much, so if you want to do something, you have to find something else to not do.
All of the incentives then fall directly out of that. If you're getting money and nobody is paying attention to you, the most important thing is to keep them not paying attention so that you can keep getting the money. Meanwhile, if you want to get money for something, make someone else look bad so you can justify taking it from them.
Obviously this doesn't produce good results, but the problem is structural. The populist reforms of the first half of the 20th century deleted all of the checks and balances on federal spending (compare how much of the budget of EU countries is the EU itself). That forces government programs to compete on political power rather than merit: because the trough is already full, you can't just argue that a program is worth the money, you have to find something else to displace whose advocates are weak enough to defeat. Which has little to do with whether your program is better than theirs.
Imagine if we spent half as much money but the reduction came out of the likes of ludicrous military boondoggles and de facto subsidies for pharma companies.
Sounds like how I start projects. Try to figure out every last detail, get hung up on doing all this preparation, learning, etc., and eventually give up because I’m tired of the process or things aren’t turning out as perfectly as I’d like.
Recently I’ve been trying to follow the ‘move fast and break things’ mentality, albeit more of a ‘just try moving a bit and things breaking is ok’ mentality.
I don't disagree completely with this, but just want to point out that it's kind of a bad smell to have computational biologists who are - as someone in the article puts it - computationally illiterate. I have met lots of these types over the years, and usually their methods are kind of a gong show. If you can't properly sanitize the column headers of your data inputs, why should I trust that you've treated the rest of your data properly?
I have a strong feeling that, if people really put an effort into reading and replicating more papers, we would find that a lot of what's being published is simply meaningless.
In grad school I had a subletting roommate for a while who was writing code to match some experimental data with a model. He showed me his model. It was quite literally making random combinations of various trigonometric functions, absolute value, logarithms, polynomials, exponents, etc. into equations that were like a whole page long and just wiggling them around. He was convinced that he was on a path to a revolution in understanding the functional form of his (biological) data, and I believe his research PI was onboard.
I guess "overfitted" never made it into the curriculum.
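For anyone who hasn't seen the failure mode: give a model as many free parameters as you have data points and it will "fit" anything, noise included. A minimal sketch in Python/numpy (the dataset is invented for illustration):

```python
import numpy as np

# Five noisy points whose underlying truth is simply y = x.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 5)
y = x + 0.1 * rng.standard_normal(5)

# A degree-4 polynomial has as many free parameters as we have points,
# so it interpolates the noise exactly -- a "perfect" fit that explains
# nothing, just like a page-long random equation wiggled into place.
coeffs = np.polyfit(x, y, deg=4)
train_error = np.max(np.abs(np.polyval(coeffs, x) - y))
print(train_error)  # effectively zero

# Away from the training points the curve is unconstrained by the data,
# so "predictions" there are meaningless.
print(np.polyval(coeffs, 1.5))
```

The zero training error is exactly what convinced the roommate he was onto something; held-out data would have told a different story.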
> It was quite literally making random combinations of various trigonometric functions, absolute value, logarithms, polynomials, exponents, etc. into equations that were like a whole page long and just wiggling them around.
Technically, we call that a "neural network". Or "AI".
I work in computational materials science (where ML brings funding) and a funny paper of this kind is here: https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.11... - they are literally trying out 100000s of possible combinations by brute force, to build a "physical model".
Then they go to conferences and brag about it, because they have to (otr they know it's bs).
Datasets are so-so (you can have a look at QM9...) and for more specialized things, people generally don't bother trying to benchmark or compare their results against a common reference. It's just something new...
And on top of all that: even setting aside people applying fancy statistical methods without knowing much about them, the theoretical computations themselves might not make much sense (at least not at the sheer volume being pumped out and published)...
Oh, I thought "on the real" fit the context better, meaning they knew in their heart of hearts it was bullshit, but "off the record" is about the same.
> I have a strong feeling that, if people really put an effort into reading and replicating more papers, we would find that a lot of what's being published is simply meaningless.
People have figured that out long ago [1] (I know the author of that paper lately turned somewhat controversial, but that doesn't change his findings). It's not very widely known in the general public. But if you understand some basic issues like p-hacking and publication bias and combine that with the knowledge that most scientific fields don't do anything about these issues, there can hardly be any doubt that a lot of research is rubbish.
Yeah, but one would hope that science has a higher standard. 80% garbage results in science sounds catastrophic to our understanding of the world, and in particular when it comes to making policies based on that science.
There's the saying "science advances one funeral at a time."
'"A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." This principle was famously laid out by German theoretical physicist Max Planck in 1950 and it turns out that he was right, according to a new study.'
Also the story of Ignaz Semmelweis who discovered that if doctors washed their hands it reduced deaths during childbirth - but for a variety of reasons his findings were resisted.
In grad school I had a friend who was doing olfactory (smell) research on rats with tetrode drives (wires in their brain). He was looking at the neuronal response to smells that they gave the rats and had a few signals to match up: the signal from the arduino running the scent gates, the amps that reported the weak neuronal currents, the nose lasers that gated the arduino, etc. He was having a hard time getting through all the data in his MatLab code and I offered to help for some beer.
After the 11th nested 'if' statement, I upped the request to a case of beer. I'm not certain he ever got the code working.
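For the curious, the usual fix for that kind of pyramid is guard clauses / early returns. A toy sketch (the condition names are invented, not his actual code):

```python
# Toy sketch: the same event filter written as a nested pyramid
# vs. flattened with guard clauses.

def keep_event_nested(scent_open, spike_seen, laser_ok):
    if scent_open:
        if spike_seen:
            if laser_ok:
                return True
    return False

def keep_event_flat(scent_open, spike_seen, laser_ok):
    # Bail out early on each disqualifying condition instead of nesting.
    if not scent_open:
        return False
    if not spike_seen:
        return False
    return laser_ok

# Both agree on every input combination:
from itertools import product
assert all(
    keep_event_nested(*c) == keep_event_flat(*c)
    for c in product([True, False], repeat=3)
)
```

With 11 conditions the nested version is 11 indents deep; the flat version stays at one.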
To the larger point, scientists are not programmers. They got into their programs to do research. What keeps them going is not the joy of programming, but the thrill of discovery. Programming is nothing but a means to an end. One they will do the bare minimum to get working. Asking hyper stressed out grad students to also become expert coders isn't reasonable.
And yes, that means that the code is suspect at best. If you load the code onto another computer, make sure you can defenestrate that computer with ease; do not use your home device.
I keep seeing this sentiment when it comes to those in the natural sciences, but it makes no sense.
I could replace "programming" in your above little bit with "mathematics" and it would be just as weird.
Our modern world runs on computers and programs, just as our modern world and modern science built itself on mathematics and required many to use it. So too the new world of science may require everyone to know to program just as they know about the chemical composition of smells, or the particulars of differential equations, etc.
And I know your argument isn't "they shouldn't learn programming", but honestly since I keep seeing this same line of reasoning, I can't help but feel that is ultimately the real reasoning being espoused.
Science is getting harder, and its requirements to competently "find the exciting things" raise the bar each time. I don't see this as a bad thing. To the contrary, it means we are getting to more and more interesting and in-depth discoveries that require more than one discipline and specialty, which ultimately means more cross-functional science that has larger and deeper impacts.
Again: these are tools that are means to an end. They only need to work well enough to get the researcher to that end.
A lot of what are considered essential practices by expert programmers are conventions centered around long-term productivity in programming. You can get a right answer out of a computer without following those conventions. Lots of people did back in the day before these conventions were created.
That's not to say that everybody with horrible code is getting the right answers out of it. I'm sure many people are screwing up! My point is just that ugly code does not automatically produce wrong answers just because it is ugly.
By analogy, I'm sure any carpenter would be horrified at how I built my kayak rack. But it's been holding up kayaks for 10 years and really, that's all it needs to do.
I will add that in general, statistical analysis of data is not by itself adequate for scientific theory--no matter how sophisticated the software is. You need explanatory causal mechanisms as well, which are discovered by humans through experimentation and analysis.
And you can do science very well with just the latter. Every grand scientific theory we have available to us today was created without good programming ability, or really the use of computers at all. Many were created using minimal math, for example evolution by natural selection, or plate tectonics. Even in physics, Einstein came up with relativity first, and only then went and learned the math to describe it.
Your point is maybe a little obtuse to me, because it sounds like you are arguing for "computers are tools that should be learned, but really no one does and who can blame them, they just want to science" and simultaneously arguing, "tools aren't science, and science can be done without them".
I feel like the latter is obvious: of course the tools aren't science, but if you want to do real work and real science, your tools are going to be crucial for establishing measurements, repeatability, and sharing how one maps their hypothesis onto real-world mechanics.
Likewise, the former is just the same commonly repeated thing I just argued against and my reply is the same: so what? You building a kayak is not science and is irrelevant.
Scientists can't reach a meaningful conclusion without proper use of tools. All they can do is hypothesize, which is certainly a portion of science (and many fields are in fact stuck in this exact stage, unable to get further and come to grounded conclusions), but it is not the end-all of science, and getting to the end in modern-day science means knowing how to program.
Of course there are exceptions and limitations and "good enough". No one is arguing that. The argument I am refuting is those who think "tools are just tools, who cares, I just want my science". That is the poor attitude that makes no sense to me.
> Scientists can't reach a meaningful conclusion without proper use of tools.
I'm just trying to make the point that "proper" is subjective. Software developers evaluate the quality of code according to how well it adheres to well-established coding practices, but those practices were established to address long-term issues like maintainability and security, not whether the software produces the right answer.
You can get the right answer out of software even if the code is ugly and hacky, and for a lot of scientific research, the answer is all that matters.
The usual reason programmers object to ugly, hacky code is that it's a lot harder to be justifiably confident that such code actually does produce the right answer -- "garbage in, garbage out" is just as true in function position as it is in argument position.
Tbh I think it's a case for multidisciplinary research. You wouldn't hire only one skill set to run a company, even a tech one, so why should research be any different? That's probably where the deep insights are.
People who are just decent programmers can make at least twice (probably 3 or 4 times) as much money working in industry as in science in an academic environment. Most programmers who would work for less money because they are interested in science will be more interested in computer programming problems than in basic programming to support a scientist. NSF won't give you $250k to hire a really good programmer to support your gene analysis project. More like $100k if you are lucky.
So what you end up with is that the great scientists who are also decent programmers are the ones who can do the cutting-edge science at the moment.
Think of the flip side: Programmers are terrible biologists.
Sure, it would be great if we all had more time to learn how to code. Coding is important. But I'd say the onus should be on coders to build better tools and documentation so they are empowering people to do something other than code, rather than reduce everything to a coding exercise because making everything look like code means less boring documentation and UX work for coders.
I mean, biology is in fact a full on degree program and you pretty much need a PhD before you're defining an original research topic. It's not because biologists are dumber and learn slower. It's that biology is complicated and poorly understood, and it takes years to learn.
Contrast this with coding... you don't even need to go to college to launch a successful software product, and the average person can become proficient after a few years of dedicated study. However, those are a few years that biologists don't have, as their PhDs are already some of the longest, time-wise, to finish.
The decision to rename genes is totally consistent with the biologist's MO: if a cell won't grow in a given set of conditions, change the conditions. Sure, we can CRISPR-edit the genes to modify a cell to grow in a set of conditions, but it's usually far easier to just change the temperature or growth media than to edit a cell's DNA.
My take away is that this is more a failure of programmers and/or a failure of their managers to guide the programmers to make tools for biologists, than of biologists to learn programming. Sure, coders get paid more, but they aren't going to cure cancer or make a vaccine for covid-19 without a biologist somewhere in the equation. And I'm glad the biologists developing vaccines today are doing biology, and not held up in their degree programs learning how to code!
MatLab has taken over bio specifically because it has great documentation and examples. If Python is pseudo-code that compiles, then MatLab is just speaking English. Even still, the spaghetti that researchers get into is just insane.
> To the larger point, scientists are not programmers. They got into their programs to do research.
I would say most research, to an ever growing degree, is so heavily dependent on software that it's tough to make that claim anymore. It makes no sense to me. It's like saying Zillow doesn't need software engineers because they are in the Real Estate business, not the software business.
I maybe misspoke. I meant that scientists do not go into science to program, they go into it to discover and do research (among many many other things). Sure, some do find joy in good programming, but that's not why they are there to begin with. Becoming a better programmer isn't their passion, and those skills remain underdeveloped as a result.
> To the larger point, scientists are not programmers.
I mean, sort of. Some research is essentially just programming; other research can get by with nothing but excel. Regardless, it's unreasonable to ask most scientists to be expert programmers -- most aren't building libraries that need to be maintained for years. If they do code, they're usually just writing one-shot programs to solve a single problem, and nobody else is likely to look at that code anyway.
There are lots of great computational biologists, but being a computational biologist doesn't necessitate being good with computers. Plenty of PI's rely pretty much exclusively on grad students and post-docs to run all their analyses.
Not that I'm saying using excel is bad either. I use excel plenty to look at data. But scientists need to know how to use the tools that they have.
If people are just looking at the spreadsheets then wouldn’t the cells interpreted as dates not be a problem? It seems like it would only be a problem if you’re doing computation on the cells.
It's also my experience of research in biological sciences that it is a widespread belief/fact that in order to get published in a top journal, the analysis methods must be "fancy", for example involving sophisticated statistical techniques. I worked on computational statistical methods so I'm not against that per se, but the problem is that if you have the training to contribute at the research front of an area of biology you rarely have the training to understand the statistics. Some would say that the collaborative publication model is the solution to that, but in many cases the end result isn't what one would hope for. I do think that less emphasis on "fancy" statistics, and more emphasis on simple data visualizations and common sense analyses would be a good thing.
I'm an ex-computational biologist who did most of his work in python but periodically had to interop with excel.
The basic assumption I have is that when I input data into a system, it will not translate things, especially according to ad-hoc rules from another domain, unless I explicitly ask it to do so.
It's not clear what data input sanitization would mean in this case; date support like this in Excel is deeply embedded in the product and nobody reads the documentation of Excel to learn how it works.
It would be nice if everyone was an expert at everything, but they can't be. It would be nice if they hired experts, but money doesn't grow on trees. We often insist on a degree of excellence we refuse to pay for.
It's not about being an expert at everything or hiring more people. These aren't particularly hard problems; it's not difficult to find biologists who are incredibly adept at using python, R or C. It's about thinking about how science gets funded and how it gets implemented. I've written here before about the difference between "grant work" and "grunt work", and how too much computer touching tends to get looked down upon at a certain level.
If you're deciding who gets a large-scale computational biology grant, and you're choosing between a senior researcher with 5000 publications and a broad scope, and a more junior researcher with 500 publications and a more computationally focused scope, most committees choose the senior researcher. However, the senior researcher might not know anything about computers, or they may have been trained in the 70's or 80's when the problems of computing were fundamentally different.
So you get someone leading a multi-million dollar project who fundamentally knows nothing about the methods of that project. They don't know how to scope things, how to get past roadblocks, who to hire, etc.
What's your source on it not being difficult to find biologists who are adept at using python, R, or C? Most biologists operating in private industry or academia have many years of training in their fields and many have learned their computational tools as they've gone on, meaning they've never received proper training. It seems dubious to claim that there's this neverending source of well trained biologists who are also adept at programming.
I would say the number of biologists who actually understand programming is extremely small. I've been programming for fun for ~15 years, and I'm about to finish a PhD in chemical biology (e.g. I started programming in C far before I started learning biology).
You might occasionally run into someone who is passable - at best - with R or Python. But most of the code they might write is going to be extremely linear, and I doubt they understand software architecture or control flow at all.
I don't know any biologists who program for fun like me (currently writing a compiler in Rust).
To be fair, linear code is often totally sufficient for most types of data analysis. Biologists don't really need to understand design patterns or polymorphism, they just need to not make computational mistakes when transforming the data.
Absolutely. My point was more that you can't expect comp. biologists to actually be "good" programmers when compared to SWEs or even web devs.
Most of the code I write to do biological data analysis is fairly linear. However, I also generally use a static type system and modularity to help ensure correctness.
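To illustrate what I mean -- nothing fancy, just small typed functions instead of one long script (every name here is invented for the example):

```python
# Invented example: a linear analysis, but split into small typed
# functions so mistakes surface early instead of hiding in one script.
from dataclasses import dataclass

@dataclass
class Sample:
    gene: str
    expression: float

def normalize(samples: list[Sample], factor: float) -> list[Sample]:
    """Scale every expression value by a known factor."""
    return [Sample(s.gene, s.expression / factor) for s in samples]

def mean_expression(samples: list[Sample]) -> float:
    return sum(s.expression for s in samples) / len(samples)

data = [Sample("ACTB", 10.0), Sample("GAPDH", 6.0)]
print(mean_expression(normalize(data, factor=2.0)))  # 4.0
```

Still completely linear, but a type checker (or even just reading the signatures) catches a lot of the silent mix-ups that plague one-shot scripts.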
I've perused a lot of code written by scientists, and they could certainly learn to use functions, descriptively name variables, use type systems and just aspire to write better code. I just saw a paper published in Science had to issue a revision because they found a bug in their analysis code after publication that changed most of their downstream analysis.
It does get rather problematic when you have large quantities of...stuff. You can't run linear stuff in parallel so now you're bound to whatever single CPU core you have lying around.
I'd say that getting some basic data-science computing skills should be more important than the silly SPSS courses they hand out. Once you have at least baseline Jupyter (or Databricks) skills, you suddenly have the possibility of doing actual high-performance work instead of grinding through gruntwork. But at that point the question becomes: do the people involved even want that?
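And the jump from a single-core loop to multiple cores can be pretty small; a minimal sketch using only the stdlib (the analysis function is a made-up stand-in):

```python
# Stand-in sketch: the per-sample function here is invented; the point
# is only that a linear loop becomes a parallel map with the stdlib.
from multiprocessing import Pool

def analyze(sample):
    return sample ** 2  # placeholder for a per-sample computation

if __name__ == "__main__":
    samples = range(8)
    with Pool(4) as pool:  # 4 worker processes
        results = pool.map(analyze, samples)
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

`Pool.map` keeps the results in input order, so downstream code doesn't need to change at all.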
I write 'one off' programs all the time. Most of what I write I throw away, and I program for a living. Those are usually fairly linear, which is fine. If I am writing something that will be re-used in 6 different ways by a 5-person team, that is when you get out the programming methodologies. It is usually fairly obvious to me when you need to do that. Someone who does not do it all the time may not know 'hey, stop, you have a code mess'.
It's one of the reasons why people end up with spreadsheets. Most of their data is giant tables, and Excel does very well at that. It has a built-in programming language that is not great but not totally terrible either. Sometimes all you need is a graph of a particular type: paste the data in, highlight what you want, use the built-in graph tools. No real coding needed. It is also a tool that is easy to mismanage if you do not know the quirks of its math.
It doesn't take being an expert at Excel to understand how Excel autoformats. It takes a few days of actually working with data or an introductory class that's today taught in American primary schools.
Sorry for asking but are you familiar with how MS Excel aggressively converts data to dates? There's no way to "sanitize" it (without resorting to hacky solutions like including extra characters) and even if you fix the data, it will re-change them to dates the next time you open the file.
I'm only familiar with LibreOffice and not Excel myself, but: if you want to be sure a column is treated as text in a CSV file, you have LO quote text fields on save, and have it treat quoted fields as text on import. I assume Excel must have similar options.
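For what it's worth, the "quote every text field" defence is easy to apply from a script too. A minimal Python sketch (column names invented; note that Excel's import behaviour varies by version, so treat quoting as a mitigation, not a guarantee):

```python
# Write a CSV with every field quoted, then read it back with the csv
# module, which keeps everything a plain string -- no date coercion.
import csv
import io

rows = [["gene", "count"], ["MARCH1", "12"], ["SEPT2", "7"]]

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_ALL).writerows(rows)

parsed = list(csv.reader(io.StringIO(buf.getvalue())))
assert parsed == rows  # "MARCH1" and "SEPT2" survive as text
```

The fragile step is still the round trip through a spreadsheet UI, which is exactly why keeping the canonical data in a plain text file helps.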
For the most part we aren't talking about computational biologists but experimentalists using Excel. People at the bench need to collect their data somehow, and using Excel for tabular data and Word for text data is just what they know. Typically they then pass these files over to computational biologists for analysis. Yes, it would be nice if they would use more appropriate tools, but I know from experience that the typical result of trying to teach them better tools is the experimentalists just rolling their eyes and saying that they don't have time to learn some nerdy program because they have experiments to run.
Considering how Perl was chosen as computational biologists' lingua franca in the 1990s and 2000s because it was good at text manipulation (since genes are represented by text), I would say they don't have a history of making good choices.
> What might work is that we require replication work for a PhD
I don't think this will work. All it will do is devalue replication studies, because only PhD students would do them. It's also not in their best interest, especially if they dispute the findings of established researchers.
Also, we have to get away from the idea that the scientist's job is to think and write, and literally all of the other work can be shuffled off onto low-wage (or no-wage), low-status workers. This is one of the biggest reasons that science is going through such a crisis. If you want enough papers to consistently get grants, you probably need at least 4-5 PhD students every few years. This causes a massive glut in the job market. It also dissociates scientists from their work. I've met esteemed computational biologists who could barely work a computer. All of their code was written, run, and analyzed by graduate students or post-docs. They were competent enough at statistics, but that level of abstraction from the actual work is troubling.
Requiring replication work for a PhD seems like a great idea. PhD programs already use a mandatory exercise—the qualifying exam—to check a student's competence, with ambiguous effectiveness. Turning the qualifying exam into a replication study seems like a win: it tests the student's ability to do their actual job rather than pass an abstract test, and produces output that is useful both to the student and the community. The qualifying exam committee (usually ~ 4 PIs from different labs) can do quality control on the replication.
> All it will do is devalue replication studies, because only PhD students would do them. It's also not in their best interest, especially if they dispute the findings of established researchers.
Most studies are done by students regardless, so it seems unlikely that replication studies would be devalued merely because they're done by students. Although disputing the findings of established researchers can be risky, they would be publishing jointly with their PI (or, with the above implementation, multiple PIs), not alone with no support. Few students want to stay in academia, so it usually doesn't matter to them if a professor at some other institution gets offended. Most importantly, if everyone is doing replication studies, there will be so many disputations flying around that any particular person is less likely to be singled out for retaliation.
It sounds like what you're suggesting would be functionally equivalent to PI-led replications, which I would agree is a good idea. There are still some practical problems though.
1. Studies can be much more expensive than most people think. In my field, a moderately sized study can easily cost $100,000+ if you're only accounting for up front cost (e.g. use of equipment, compensating participants). Someone would have to foot the costs of this.
2. Studies can be incredibly labor-intensive. PIs can get away with running studies that require thousands of man-hours because they have a captive market of PhD students, post-docs, and research assistants all willing to work for low wages or for free. PhD students usually don't have the same amount of man-power.
3. For obvious reasons, high-cost, labor-intensive studies naturally tend to get replicated less. In other words, the least practical studies to replicate happen to also be the most necessary to replicate.
A couple of things I would dispute:
> it seems unlikely that replication studies would be devalued merely because they're done by students
I think academics value work in a particularly skewed way. There is "grant work" and there is "grunt work". Grant work is anything that actively contributes to getting grants for one's institution. Grunt work is everything else. PhDs can do grunt work, but that doesn't mean it will be valued on the job market. For example, software development is actively sought after in (biology) grad students, because it's a very useful skill. However, I've also seen it count against applicants for professorships because it shows they spent too much time on "grunt work". Software development skills don't win grants.
> Few students want to stay in academia
In some fields there aren't any options except to stay in academia or academia adjacent fields.
Not OP here, but my issue with the recommendations is that they've pretty accurately listed a whole bunch of mostly structural problems with academia, yet all of the suggestions boil down to "we all just need to try harder". You can say something like "journals need to demand higher standards", but what incentive do they actually have to do so? Then you can counter with "scientists could vote with their feet", but what incentive do they have to do that? You're asking people to consider seriously damaging their career for some nebulous quality metric.
Frankly, having worked in academia long enough to see at least a couple shifts in culture, the only thing I can see that comes out of this is a couple more things get added on to the ever growing checklist of publishing a paper/submitting a grant application.
I think we need to get away from the sort of thinking where large structural problems can be solved by tiny incremental improvements. If you really want to solve the problem, one or more of [Granting Agencies|Journals|Universities] has to be completely torn down and built back up.
It seems to me, still, that a lot of these problems you bring up can be addressed by universities changing their hiring policies. Which makes sense: academics ultimately rely on universities for their income, and so it is the hiring policies which are setting the perverse incentives. And I don’t think changing hiring policies would be an incremental change, it would be a huge change (and not likely to be made by any university any time soon, since students rank universities on similar metrics to how universities hire staff — a prestigious university will lose prestige even if it changes its hiring policies for the better).
> academics ultimately rely on universities for their income
Sort of; a huge portion of income is from grants, particularly after the first few years from being hired. More importantly, a huge portion of the university's income is from grants. When a researcher receives a grant, there is an "overhead" percentage that goes to the university. Universities hire, in part, to maximize those overheads, which means getting the researchers with the best chance at getting big grants.
Changing the hiring process may affect how PhD students act, but once they're "in the system", they are subject to all the same problematic incentives.
> one or more of [Granting Agencies|Journals|Universities] has to be completely torn down and built back up
Right, and unless the new institutions exist in a financial vacuum, they will remain built on, and affected by, the broader systems, resulting in conflicts of interest.
> One silver lining of living in these United States is that you can be assured that complex technology picked up by cities and counties (and often even states) will be implemented carelessly and wielded by only the hammiest fists.
This is absolutely not reassuring. All it means is that the wrong people are going to be surveilled, arrested, and punished.
Speaking as someone who works with a lot of state IT people: if they don't know how to hire programmers, they probably don't know how to manage programmers. In a bureaucracy, if the administrators are making these kinds of unrealistic hiring demands in a crisis, it probably means the administrators outrank all programmers and don't trust them to make their own decisions. That is not an environment to work in.
This is a crisis, not a normal time. Fixing this system will make the difference between real people getting unemployment checks or nothing. That's worth enduring a bad boss for.
Problem is, that's how they get away with everything.
They fix nothing, then wait for somebody to need them enough, or to have enough compassion, to handle their mess.
Basically, what they are saying here is: we didn't listen to all those competent people who told us we should have cleaned up this system a long time ago because one day it would bite us. We didn't think technical debt was a thing. It had always worked. And it's virtual. And IT is always exaggerating anyway.
Now it does bite them back, and do they take responsibility?
No. They ask competent people to have mercy and save the day because the situation is dire. Because this time is special, unlike all the other times.
It's just like the health care system that never got fixed, where the medical personnel are now paying for it so that people don't suffer as much from the mismanagement.
They will not pay more. They will not fix anything. They will not respect you more, change the way things are done, or take any responsibility for this. And in fact they will probably guilt-trip you into giving up your health to overwork on this, with fewer resources and for less pay, because it's an emergency.
A friend of mine works in a hospital right now. She has been at work for 15 days straight. They already told her she won't get paid for the days she worked but wasn't scheduled to.
Then, as a final fuck you, if this turns out okay, if the catastrophe is avoided, they will get the reward.
The nurses or the programmers? They will be forgotten in a week. Nothing will change.
But the mismanagers? Their superiors will thank them for handling the situation well. They may even get promoted and given more responsibility.
So you have a choice: save poor people's money now and reinforce the suck of the whole system, or let people suffer for a chance at a redo and a purge of the ones who led to this.
This is a terrible choice to have to make.
Especially since we have the empathy to agonize over this, while they clearly don't.
No, it's not. These state IT systems, and the habits that lead to their inevitable downfall, need to die.
They respect nothing. Scope? No, we need to fulfill every need and then some. Budget? No, of course we don't have cash, and we have to be very careful, it's the taxpayers' money! Time? It's already late! (Yet they weren't able to get it done in 10+ years.)
They have no real competency; they don't even have the competency to delegate this to someone competent. And they lack the competency to manage their own inconsistency on these issues.
The linked tweet thread is a perfect example of this. (Rampant project mismanagement; too big to fail; they introduced some Hadoop scoring system to match inconsistent records and whatnot instead of simply throwing out all the bad data and handling the rest separately, e.g. by hiring a bunch of unemployed people to go over them.)
A state that should be at the top of any project management and procurement hierarchy wasn't able to supervise an IT system project. It's so incompetent that just thinking about it leads to spontaneous combustion.
Yup. Exactly right. And they think the 26-year-old outside consultant knows how to fix the problem better than the 50-year-old government programmer whose self-respect management has spent 17 years beating down.
Consistently assigning work to big firms who consistently screw up... All they want to do is show that "they're trying", and the charade will continue until someone actually does get fired for buying IBM.
Like off-shored protective medical gear, off-shored pharmaceutical manufacturing, and now a country full of unskilled 20-somethings who can only do phone apps and work as coffee baristas.
Companies will not like it, consumers will not like it, state budgets will not like it,
But, like COBOL programmers, we need to learn how to make things and build our own supply chain.
My old employer did not listen last year when I retired. Nobody was assigned to take over an annual change to a REXX program. I had learned it on my own because it predated me.
But the process was so rigorous: we had to be our own RACF admin to make our own RACF ID to get to the datasets to change, then run the programs. And they got rid of the printers connected to the mainframe and tightened up FTP so much that it took days to refresh passwords and FTP print files to your PC hard drive.
Then you'd run programs in C# that stripped off the first column and read the mainframe page skips...
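For anyone unfamiliar with mainframe print files: the first column conventionally holds an ASA carriage-control character ('1' means start a new page), which is why it has to be stripped before the data is usable. The original tooling here was C#; below is only an illustrative Python sketch of the idea, with the function name and sample data made up for the example.

```python
def split_mainframe_print(lines):
    """Split a mainframe print file into pages.

    Assumes ASA carriage control: column 1 holds a control
    character ('1' = skip to a new page); the rest is data.
    """
    pages, current = [], []
    for line in lines:
        control, text = line[:1], line[1:]
        if control == "1" and current:  # page skip: close out the page
            pages.append(current)
            current = []
        current.append(text)
    if current:
        pages.append(current)
    return pages

# Hypothetical two-page report as it might come off the mainframe.
report = [
    "1HEADER PAGE ONE",
    " detail line A",
    "1HEADER PAGE TWO",
    " detail line B",
]
pages = split_mainframe_print(report)
# pages -> [['HEADER PAGE ONE', 'detail line A'],
#           ['HEADER PAGE TWO', 'detail line B']]
```

The real versions of these programs would also have to handle '0' (double space) and '+' (overprint) control characters, but the column-stripping step is the core of it.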
And I was supposed to teach this to a C# programmer who would not need to run it for 6 more months?
I blame management. They loved outside consultants and treated our own programmers like loading-dock employees: write-ups for being 5 minutes late to work.
They wanted me to take a brand-new high-end PC with me when I retired. I left it on my desk when I left. They had 6 months to learn it.
Given a choice between the government doing it and a company that has been specializing in this type of thing for decades, why wasn’t this the best choice?
> Given a choice between the government doing it and a company that has been specializing in this type of thing for decades, why wasn’t this the best choice?
Because the only thing they are specialized in is sucking tax money from the government.
> It’s not like HP is some unknown foreign company.
That's their only pedigree. In other countries they also took a lot of money and delivered crap, so people should be aware by now. But with all this corr^W lobbying...
Seems like OP is describing the plight of the average worker, who gets left to clean up the mess with no credit at the end of it all, rather than that of the business leaders or executives.
IIRC, the capable workers were the first to leave for Galt's Gulch. The story dwells on the slightly-less-bad business people running around like decapitated chickens trying to keep things working, but the real heroes just quietly noped out.
> So you have the choice between saving the money of poor people now and reinforcing the suck of the whole system, of letting people suffer for a chance of a redo and purging the ones that lead to this.
What a terribly wrong take. If you really believe the system is that corrupt and that horrible, then you know darn well that the count of people who suffered during this emergency won't make a difference. There will be "accountability" in some form, but it won't be driven by how many people did or didn't get money for food.
Whether you help them or not won't change the system. So instead you're saying scores of families should go hungry so you can stay on your high horse and lecture.
That's a good reason. You don't need a moral reason to turn down a paid job. The reason scores of families will go hungry is the mismanagement, not that one human did not step up to keep the mess rolling.
By analogy, the reason Africa is starving is the past and present extreme exploitation by foreign powers, and the corruption and violence those powers sponsored. Not because "a heartless middle-class salary man did not send his monthly 10 dollars", turning a blind eye to the real causes of the situation.
> Fixing this system will make the difference between real people getting unemployment checks or nothing. That's worth enduring a bad boss for.
In absolute terms, perhaps. But if you are moving away from taking jobs out of self-interest, and toward doing something for the public good, you might as well check out https://80000hours.org/ and pick the job that has the most positive impact.
My guess is that working any kind of 'normal' programmer job and giving 10% of your income to an effective charity would beat out enduring the bad boss in New Jersey.
You working for them would most likely not push them from failure to success. It would perhaps make success marginally more likely, or perhaps decrease schedule overruns slightly. Or perhaps make not much of a difference at all, depending on how screwed they are.
Of course, the estimate of impact also depends on how you value humans. If you value all humans fairly equally, then Americans who had a job until fairly recently (ie those eligible for unemployment insurance) are already fairly well off compared to the people who benefit from more malaria nets. Especially since their unemployment benefits would merely be delayed, not lost.
Lots of people value those closer to them more highly. The most prominent examples are family and friends. But valuing compatriots more highly is fairly common as well.
Of course, you can give more than 10% of your income as well.
I would do it myself, for free, for the sake of the unemployed.
But I see no link to who is hiring without going through a headhunter who takes a rake, and states cannot send you a firewall link and password without a month or more of red tape. And rightly so, if it's a payroll system, given the opening it would make for abuse.
There's a difference between a generic "bad boss" and an incompetent one. An incompetent one will kill the project.
If NJ has failed to maintain the legacy systems that good COBOL development needs, then almost by definition it has incompetent management (and a Governor who can't even pronounce COBOL). They likely don't have the dev or test boxes they should have. They will either put up barriers to the other systems the developer needs to understand, or will throw open the doors so that your fixes get broken by everyone else's fixes.
> Fixing this system will make the difference between real people getting unemployment checks or nothing.
Let them get nothing. Then they'll call for the heads of the governors, the very people who couldn't care less about the system until it stopped working. They are the ones who could have made a difference but actively chose not to because it just wasn't that important to them.
A bad boss will absolutely give you burnout, and that lasts for years. You are of course correct, but at the same time I would never ask this of someone and I wouldn't subject myself to it either. Bad bosses will destroy your quality of life in the medium-long term.
They'll change their tune when they won't be able to find anyone. The free market is as honest as it gets. If you underpay and undertreat your employees, they will seek an alternative. It is simple opportunity cost.