The Ethical Failures Behind the Boeing Disasters (apaonline.org)
119 points by whack on Dec 26, 2019 | 70 comments


Putting events like this down to ethical failure has a problem: it allows the investigation of the event to stop at a "single point of failure" in the "human system".

Reporting on Boeing and elsewhere has revealed that for a substantial period of time the focus of management was essentially "gutting" the company: replacing highly skilled and expensive engineers and workers with cheaper workers elsewhere, and generally removing any impediment to immediate profit. This included stretching the 737 spec to the point that it essentially "broke", rather than taking the cost hit of designing a new plane for a new era.

The thing is, when the paradigm the top puts out is "do whatever it takes to make those numbers", you already have an implicitly unethical outlook, but one with plausible deniability. The people who push that (whether management, hedge funds, or the board of directors) are careful not to overtly advocate anything illegal or immoral, but it seems logical that when underling X who does unethical act Y is caught and punished, that underling will be replaced by another who will face the same pressures and quite likely engage in similar unethical behavior.


Target-based metrics, be they budget targets or output targets, are too fragile to be useful in my opinion. There is likely a perfect number that's totally reasonable and ensures good productivity while leaving enough room to be safe and grow, but I've yet to see something like that implemented. I've also never seen targets make a company a better place to work, and I would argue that if somewhere is not a good place to work it's inherently less productive and efficient. Management can always compensate for that with more hours or more humans, but then you're bloating your system.

Like retail stores with 8 employees on, where some are meant to handle stock and some are meant to serve customers, but you've given them all sales budgets for the day. The stock doesn't get handled, and you have 8 people not meeting their budgets. All the employees are unhappy about that, stressed, and working against each other to try to succeed.

Competitive metrics just make it so you have a handful of "star" people who are great at undermining their colleagues in order to meet their own budget, rather than an effective team of good people. So in the example, by the metrics, you have 1 good employee and 7 bad ones, no one is happy, your processes have fallen apart, and your team doesn't work effectively together. The only people who get fired are the employees, yet the store continues to underperform.


> Target-based metrics, be they budget targets or output targets, are too fragile to be useful in my opinion.

The reported target in the 737 MAX case was that Boeing had promised Southwest a $1 million penalty per plane if retraining was required.

So both Boeing and Southwest created a perverse incentive to make the plane less safe. One or two airline accidents can close an airline, so I guarantee nobody at Southwest really wanted to make that deal in retrospect.

Although Southwest didn't have any 737 MAX accidents and they paid for the deluxe MCAS instrument package, they were horrified to find out that what was delivered, what it was supposed to do, and what was documented were 3 different things.

Source: commercially-rated pilot, followed MCAS fiasco daily. Search for "simulators" in the following link:

https://www.forbes.com/sites/petercohan/2019/03/28/did-airbu...


> The reported target in the 737 MAX case was that Boeing had promised Southwest a $1 million penalty per plane if retraining was required.

In context that's 1-2% of the sale price of a MAX (depending on discounts). It's possible that the retraining penalty drove MCAS, but that seems unlikely to me.

> deluxe MCAS instrument package

No such thing exists. The AoA annunciator was supposed to be active on all 737 MAX planes.


The AoA Disagree alert was "standard", the AoA indicator was always (stupidly!) an extra, and the alert in fact (accidentally) only worked when the indicator add-on was installed.

https://boeing.mediaroom.com/news-releases-statements?item=1...

Neither of the planes that crashed had the optional add-on, so they ended up without the alert as well.

https://arstechnica.com/information-technology/2019/03/boein...
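For the curious, the cross-check itself is simple. Here's a minimal sketch in Python (the 5.5-degree threshold was publicly reported for the MAX's AoA Disagree alert; the code and names are otherwise hypothetical, not Boeing's implementation):

    # Sketch of an AoA disagree check: compare the two angle-of-attack
    # vanes and flag when they diverge. Illustrative only.
    AOA_DISAGREE_THRESHOLD_DEG = 5.5  # reported threshold, flaps retracted

    def aoa_disagree(left_vane_deg, right_vane_deg):
        """True when the two AoA vanes disagree beyond the threshold."""
        return abs(left_vane_deg - right_vane_deg) > AOA_DISAGREE_THRESHOLD_DEG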


Great example. Another is that you may have one "star" employee who consistently notices the thing that most needs doing despite incentives, like jumping on the stock problem, and when review time comes around, they're the absolute worst employee on paper. So you fire them, and things get worse, and there's no obvious reason in the analytics as to what went wrong.


Seems to me that at some point they should just say “it’s good enough” and be content with current growth rates. The paradigm of never-ending growth is pretty much guaranteed to push companies to the point where they produce inferior goods. The same happened to Deutsche Bank. I still remember when they were doing well and the CEO got it into his head to increase their margins even more, declaring that 25% was now the goal. A lot of people said that a bank simply has to take on too much risk to produce those numbers and, indeed, the bank blew up. Some industries simply can’t be run like consumer-goods companies because the consequences of failure are too severe. Obviously this doesn’t stop people from trying repeatedly. With current executive salaries it makes total sense to destroy a company’s base in exchange for a few profitable years.

That’s one of the reasons why I am against building more nuclear plants and running them like other businesses. They will start out doing solid work, but over time management will get greedy and reduce quality until something blows up.


You see the same thing with wage-theft in large chain stores. Corporate HQ always says that they don't support it, but then they set up sales and payroll goals that can most easily be met with wage-theft.


I believe all of these generally stem from US law making it impossible to charge someone for being an idiot. So provably contradictory demands like "Hit the payroll goals" and "Don't commit wage theft" can be made by management all day.

However, actual liability is only accrued by those who act on those policies.

It'd be nice to see liability, once determined at a lower level, forced to follow the organization chart back to the root decision. Possibly balanced for something like direct reports.

E.g. you manage 100 people, 20 of them committed fraud, you have a 20/100 share of the crime


It's even worse because, unless someone is stupid enough to send an e-mail saying "hahaha wage-theft, they'll never catch us," bad actors can hide behind the veil of idiocy. They might have been promoted up from these stores and know that wage-theft is happening, but they continue on knowing that they won't face consequences for it later.


Same for Wells Fargo. Set impossible goals and then be surprised that people are taking shortcuts. I work in medical and even there they squeeze people to the point of reducing quality. At some point you prefer going home on time over doing a thorough review of the code during (unpaid) overtime.



I'm talking about something different from control fraud; particularly in the case that it can arise unintentionally. The fact that it can arise unintentionally makes it much harder to combat because there is built-in plausible deniability.

Example of it occurring unintentionally:

1. A corporation sets goals for store managers based upon performance metrics that include payroll. These goals are currently realizable by a competent manager without forcing workers to work without pay.

2. Some fraction of managers realize that, rather than actually working hard to meet the goals, they can get their numbers up by forcing workers to work without pay. This skews the metrics a bit.

3. The most incompetent managers that aren't committing wage-theft are demoted or fired, and replaced with managers that are either more competent or willing to commit wage-theft.

4. The payroll goal is now adjusted down because the previous accounting-term's payroll average is lower than before.

5. Competent managers manage to get jobs elsewhere, so the fraction of managers willing to commit wage-theft increases.

6. GOTO 2

Eventually the metrics are so skewed that the majority of managers must either commit wage-theft or lose their jobs. These managers are the only people that will get in trouble if the wage-theft is discovered.
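As a toy simulation of the ratchet (every number below is invented purely for illustration), the honest managers get squeezed out term by term:

    # Toy simulation of the wage-theft ratchet described above.
    def simulate(terms=8, managers=100, cheaters=10):
        for term in range(1, terms + 1):
            # Cheaters come in ~10% under the payroll target via unpaid work;
            # honest managers just meet it (normalized payroll of 1.0).
            avg_payroll = (cheaters * 0.9 + (managers - cheaters) * 1.0) / managers
            print(f"term {term}: next target={avg_payroll:.3f}, cheaters={cheaters}/{managers}")
            # Steps 3-5: some honest managers are fired or leave, and some
            # replacements are willing to cheat to hit the now-lower target.
            cheaters = min(managers, cheaters + (managers - cheaters) // 10)

    simulate()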



One person squarely at the center of this scandal is Boeing's Chief Test Pilot. Here's his bio:

> O’Donoghue’s military experience includes 12 years of active duty as a U.S. Marine Corps fighter pilot and test pilot. He flew operational missions in the A-4M, AV-8A and AV-8B Harrier aircraft, and engineering flight tests on the AV-8B and F-14 Tomcat. In 1994, O’Donoghue transferred to the U.S. Air Force Reserve where he flew the C-130, C-141 and C-17. While there he commanded both the 728th Airlift Squadron and the 446 Airlift Wing, stationed at McChord Air Force Base, Wash. In 2005, he retired from the Air Force Reserve at the rank of colonel.

> O’Donoghue holds a bachelor’s degree in mechanical engineering from the U.S. Naval Academy.

(http://www.boeing.com/company/bios/dennis-odonoghue.page)

This doesn't seem to conform to your theory that it's "cheaper workers" and business majors that caused these problems. Sure, they may have created undue pressure, but it was an all-American fighter jock doing a lot of the actual lying about MCAS.


To elaborate on the management, not engineering, failure: nice domestic marquee credentials for the most visible leader, but... how much credence was given to his input among the other leaders? How high-quality were his lieutenants and the teams they led? This is the kind of issue that only gets flagged through the deep work of your team.


Ethical failures can be structural, not merely individual. I think the problem you rightly identify is that decision making power and ethical responsibility are delegated separately, allowing a kind of ethical-risk arbitrage. This seems to be a failure of regulation in this particular case, or in general of ethics in management rather than engineering.


> Ethical failures can be structural, not merely individual.

At what point does some other word become more appropriate?


Some other word than ethics? When it's not an ethical problem that you're describing.

What you seem to be saying (without actually saying it, so I have to guess) is that, because it's a failure of a group rather than an individual, it isn't ethical any more - it's something else. But if the group structure pushes individuals into unethical actions, isn't the group structure an ethical problem?

Or, for the snarky answer: If companies are corporate persons, then of course they can have ethical issues.


In a high-stakes situation that fundamentally depends on innovation to provide fair value with outstanding safety, you cannot drop the ball; ethics alone will not save you.

You're going to need overall integrity in addition to ethics, from bottom to top, starting by restoring the elements that are known to have been lost, or recognized as inadequate, before you have a chance of reaching the goal again.


I can't see how these accidents point to cheap, incompetent labour, unless you are saying that the product managers, chief engineers, and principal engineers are being outsourced.

Are you saying that a plane that tends to pitch up dangerously under certain conditions, and is designed to be pushed nose-down by software with no limit on how far it will push, based on a single sensor, would have been an acceptable solution if the workers were different? This is a problem with the design and definition of a large system rather than the implementation.

So, I think in this case, whoever had any knowledge of this at any level and let it go through is absolutely responsible. While outsourcing to cheap labour elsewhere can and often does become a problem, these crashes can't be attributed to that. This was a very specific failure with very specific people in the wrong.


> Are you saying that a plane that tends to pitch up dangerously under certain conditions, and is designed to be pushed nose-down by software with no limit on how far it will push, based on a single sensor, would have been an acceptable solution if the workers were different?

With different engineers and a focus on product versus profit the MAX never would've seen the light of day. With the focus on outsourcing Boeing had a gigantic mess on their hands with the 787 (and 777X) and no resources to focus on getting a 737 replacement to market.


That’s a great point but I actually read the article as possibly implying, rather than contradicting, the notion that this is a failure that is bigger than any one engineer.

I read this as: engineering ethics were broken because of a toxic management culture.


One of the largest responsibilities of management is to set the culture of the business. They obviously do not have the skill set to do the design, engineering, fabrication, or production. They are people people; they are paid to shake hands, make deals, and cultivate workplace culture. They are solely at fault for a toxic one, because that is why their jobs exist.


I read this as: engineering ethics were broken, with no reason ascribed. Thus the engineers were 100% responsible.


This is Capitalism 101: as long as there are no real consequences for one's actions (like polluting, or causing deaths...), any capitalistic company has to make more and more money each year. As the profit universe is not infinite, they soon hit the thin smoke wall that serves as an ethical barrier and pass right through it.

If we start to put CEOs and majority shareholders in prison, things will start to shift a bit.


What stands out about Boeing's mistakes is how easy they are for laypersons to understand. Relying on only one sensor, with no redundancy? Anyone who has stood up a service on the cloud will be astounded that Boeing made this mistake. We run 3 or more copies of everything to make sure we can tolerate failures. Computers taking automated action while making it hard for humans to override (MCAS)? Again, an astounding mistake. Fixing hardware defects using software? Again, unfathomable how Boeing could have made such a mistake. Would anyone here buy a computer that overheats and catches fire unless monitored and controlled by software? That would be very unusual indeed.
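The 2-of-3 pattern we reach for in the cloud is nearly a one-liner. A minimal sketch (illustrative only; real avionics redundancy is far more involved):

    # Triple-redundant sensor voting: take the median of three independent
    # readings so a single faulty sensor is outvoted. Purely illustrative.
    def vote(readings):
        assert len(readings) == 3
        return sorted(readings)[1]  # the median

    # One stuck vane (say, reading 74.5 degrees) is simply outvoted:
    print(vote([4.8, 5.1, 74.5]))  # -> 5.1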

There are lots of very smart people at Boeing. If laypersons can recognize these mistakes then there must have been hundreds or thousands of engineers at Boeing who also recognized these mistakes. But they didn't speak up. I see this as evidence of a cultural problem.

Do you work at a company where the boss discourages you from speaking up about potential problems, and expects you to just do what you're told to do? If so your company could be the next Boeing.


"laypersons"

"Anyone who has stood up a service on the cloud"

I don't think I'd call those the same thing.


Aside from that, having a single point of failure in some component isn't necessarily unconscionable; you just make sure the system as a whole can handle the loss of the affected component.


Well, anyone who has stood up a service on the cloud is probably a layperson with respect to designing airplanes...


> Fixing hardware defects using software

That is a thing. It's not ideal, but it's not the end of the world provided it works. E.g., the F-117 cannot be flown without a computer constantly adjusting flight surfaces -- the plane isn't stable. The other decisions are definitely intolerable.

The whole thing smacks of corner-cutting in the worst way.


On the other hand, the 737 MAX can be flown relatively safely with the computers turned off.


(Mostly) reposting from a comment on an article that very few people saw:

Reuters had an article at https://www.reuters.com/article/us-boeing-737max/new-boeing-.... This article contained a paragraph that was news to me:

> Boeing had earlier turned over the documents to the Justice Department, which has an active criminal investigation underway into matters related to the 737 MAX plane.

An active criminal investigation? Wow. Boeing's in a lot hotter water than I knew.


During my civil engineering studies there was an emphasis on ethics throughout classes, even the more specialized, technical ones. This is one huge difference I've noticed compared to the software engineering field: there are no governing bodies such as ASCE or NSPE with their ethics canons. I suspect we'll see that change in the next several years.


The reason engineers are given ethical training is that engineers (like accountants, appraisers, and similar professionals) have historically had institutional independence from their employers. A civil engineer in particular can, and will, choose not to sign off on a project that they know will be unsafe. Because engineers are certified by their profession, their employment situation is not precarious, so they can reasonably make this refusal (this independence is degrading, as in all professions, but it still exists).

Programmers have no such institutional support. If a programmer refuses a job, it goes to someone else; that's it. Programmers may have ethics, but ethical training a la engineers isn't going to give them any leverage over these choices.


Exactly! You seem to get it: the issue is not a lack of knowledge of ethics, but a lack of power. That's one of the many, many reasons why strong unions are so important, and in this country we see the effects of not having them.

Without protection for people refusing to do bad things, you create a system where there's always someone desperate, hungry, or unethical enough to do things that shouldn't be done.


> The reason engineers are given ethical training is that engineers (like accountants, appraisers, and similar professionals) have historically had institutional independence from their employers.

Imagine the state of the tech world today if all of the "engineer" programmers at Google, Facebook, etc... practiced at the same ethical level as actual engineers.


Seeing how Google is firing people for trying to unionize... I would guess there would be less spyware in ads?


Actual engineers make weapons of war; do you think they would not also make adtech?


This sounds reasonable, but I don't quite understand that last bit. If a civil engineer refuses to sign off on a project, won't the same thing happen? Why won't the job go to the next civil engineer down the line?


It's a collective action problem - the company can't fire the first engineer if they know everyone else they hire would also refuse orders. But each individual engineer has an incentive (not getting fired) to break with the group's strategy.

This collective action problem is solved by coordination, through the means of the licensing body. That body can impose severe penalties (not just firing you from your current job, but from all future jobs) for anyone who betrays the group strategy, so an individual engineer can feel some more safety refusing orders in the knowledge that the whole profession will back them up.

EDIT: In civil engineering, this system is propped up by the state, which requires plans to be signed off by a licensed engineer. The guild functions in this capacity as a subcontractor of the state, taking on a regulatory burden and allowing rather more severe punishments (barring someone from a profession) than would be acceptable from a purely state organ. In software, this could be enforced by similar means for safety-critical applications - the ACM, for example, could be required to license any software engineer, with the understanding that it would revoke licenses for negligence or malfeasance that didn't rise to the level of criminal liability.


While skilled software engineers are still rare enough to be coveted, programmers are a dime a dozen these days, many desperate to get their foot in the door to break into the industry. Licensed engineers, on the other hand, are hard to come by. Professional engineers spend years in school, then years training under a licensed engineer, and then finally take an arduous test with an abysmal passing rate to get their license. They aren't about to risk their entire career to satisfy some asshole manager, because the entire business depends on the engineer, not the other way around. It's a small community and word gets around.


The problem with programmers is incompetence rather than obedience to unethical instructions. It's not a business manager who tells a developer to concatenate a SQL string rather than use a parametrized query, to leave a datastore wide open rather than setting up permissions, to store passwords unencrypted, or to skip validating inputs from the client.

Yeah, there are a handful of cases where software developers have been given bad instructions by their management, and perhaps Boeing is one of them. But the real problem is developers being unaware of the most basic good practices.
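For the record, the first of those is a one-line fix. A minimal sketch using Python's sqlite3 (table and variable names are invented for illustration):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")
    user_input = "Robert'); DROP TABLE users;--"

    # Unsafe: concatenating user input straight into the SQL string.
    # query = "SELECT * FROM users WHERE name = '" + user_input + "'"

    # Safe: a parametrized query; the driver handles quoting and escaping.
    rows = conn.execute("SELECT * FROM users WHERE name = ?",
                        (user_input,)).fetchall()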


Balancing that out, of course, is that professional engineers (I'm talking about PEs here, not programmers) can be held liable if they sign off on something they knew, or should have known, to be defective.

They would be held liable even if their boss ordered them to do so.

It's a very different set of incentives than we have in software, but maybe it's time we introduce real PEs into software development.


ACM membership comes with a code of ethics: https://www.acm.org/code-of-ethics


I wonder whether the MCAS software coders are held to any ethical safety standards by Boeing's engineering department, or whether they are so siloed that that would be impossible.


The programmer is told to implement a spec. They didn't write the spec. If the spec is flawed, do they even have the necessary domain knowledge and experience to identify those flaws? And, if they can identify them, and they speak up and nobody listens, what do they do next? Implement it anyway, or quit and let somebody else implement it instead? Especially if they don't even work for Boeing, but for some subcontractor.


It follows that if a malicious company deliberately subcontracts all of its development, it has completely indemnified itself against all claims, while simultaneously providing no opportunity for subcontractors to bring the issues they notice to its attention.

Which seems a pretty bad state of affairs to trust when building things like airplanes.


That is why high risk industries have regulators like the FAA instead of just relying on the fear of liability.

In terms of liability, Boeing can try to push it to the subbies and the subbies can try to push it back to Boeing. Both are trying to bamboozle the non-technical lawyers.

The purpose of the FAA is to cut through that crap and enforce actual, effective change through sanctions or otherwise, and they didn’t do that. That’s what fell apart here.


To wit:

In my state, gambling is legal. I remember my surprise when a friend who worked on the compliance side of the business actually knew what MD5 was (back in 2007). She wasn't a 'technical' person either.

She explained that they actually had to audit the slot machines to make sure that the code running on them had a hash that matched a codebase that had been audited and approved by the state regulatory body.
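That check is easy to sketch (hypothetical code and paths; a modern audit would use SHA-256 rather than MD5):

    # Verify that deployed code matches the build a regulator audited,
    # by comparing cryptographic hashes. Purely illustrative.
    import hashlib

    def file_hash(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify(deployed_path, approved_hash):
        """True only if the deployed binary matches the audited build."""
        return file_hash(deployed_path) == approved_hash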

So, the practice of a regulatory body auditing code is nothing new. If we do it for money, FFS can we do it when actual lives are involved?


Someone decided that the short-term money here was worth more than the long-term. I think quarterly earnings reports are the wrong optimization target for some industries. If only the execs at Boeing could imagine each of their planes carrying suitcases of their bonuses.


As for the FAA, they appear to be in the throes of regulatory capture on this issue.

https://www.economist.com/business/2019/03/23/regulatory-cap...


It is. That's why strong labor laws protect everyone, not just the workers, and why well-run unions are so fundamental to the sustainability of an economy.


Which is why it is so important to keep safety-critical software in-house. The developers shouldn't be limited to knowledge of their prime domain, software engineering, but should also know the domain the software is used for. So real aeronautics engineers need to be part of the software teams: people who would at least have a chance of detecting fatal flaws in the specs they are supposed to implement, if they were not part of the spec creation from the beginning.


They were held to the standard of "Get it done, yesterday."

And will likely eventually be held to the standard of "How dare you let this happen?" as they're fired, to demonstrate how seriously Boeing takes safety.


I haven't seen any evidence that the coders were doing anything other than accurately implementing the system design that the control systems engineers dictated.

What ethical safety standards would you have them implement?

I assume that you would agree that the control system specification is the ethical responsibility of the control system design engineers.


I don't implement anything I can't/don't understand.

I'm a generalist, so I make it my business to know a bit of everyone else's business. If I can't look at a spec without seeing issues down the chain that the spec makes no mention of, I end up feeling that it is my duty to make sure to raise the question until I am satisfied with the answer.

I don't always get the most satisfying answer, and I haven't yet had to put my career on the line by doing so; but I'm prepared to do so nevertheless.

I will not be part of the next Therac-25/MAX fiasco. And if I've learned anything from this decade, it is that engineers as a whole may need to organize against those who would seek to have us do unethical work.

It wouldn't stop the practice, and God help me, I don't want the field locked behind accreditation/licensure...

However, I don't see any other defense or measure that would allow for putting the kibosh on bad work. There has to be a price for the kind of bad corporate behavior that ruthlessly pursues performance which can only be achieved through wink-wink-nudge-nudge inducement to unethical behavior. At least, no way besides publicly airing a company's dirty laundry. That really isn't satisfying though, because it requires a sacrifice of somebody's integrity every time, and no one wants to touch you after that.

I just can't converge on a satisfying middle ground with the right incentives, besides maybe anonymous whistleblowing to an appropriate watchdog agency. Even then, you are leaving the regulation up to people who feel insecure reporting something when they have everything to lose.

It is a frustrating issue to say the least.


I've worked on systems in 3 categories:

1. Safety

2. Non-Safety

3. "Safety"

For #3 I mean "we realize that failure has bigger repercussions than a fail-whale, but we can't afford to do any of the ISO processes that have been proven to work." Sometimes I feel like my only job on those sorts of systems is to bang the "normalization of deviance is not okay" drum in every meeting.

All failures need to go to the PM and get signed off on; otherwise the PM has a false sense of the actual reliability of the system. If the PM wants more budget for safety concerns, they should be able to hand hundreds of pages of failure reports to whoever controls the purse strings and say "these are the failures in the last N days." If all they can say is "some of my engineers have expressed concerns," then zero change will happen.


> I assume that you would agree that the control system specification is the ethical responsibility of the control system design engineers.

It is the primary responsibility of the control-system design engineers, but everyone who interacts with that spec also has a responsibility to speak up if they notice flaws.

One of the big things that tight deadlines do is give tunnel vision to the engineers, so "just implement the spec" becomes the goal and the forest can be missed for the trees.

There were probably dozens of engineers that saw the MCAS specification as part of their duties; here's a few possibilities for what happened:

1. Nobody considered the case of improper MCAS engagement under normal flight conditions; this should clearly qualify the system for "Hazardous" classification under DO-178, which would require redundant AOA sensors.

2. Someone considered this case, but didn't speak up (they were very junior, or it was way outside their specialty).

3. Someone spoke up, but the person they spoke to disregarded it for the same reasons as #2, so it never made it to the control-system design team.

4. Someone spoke up, it made it to the control-system design team, and business pressures caused the concern not to be investigated.

#4 would be a significant ethical failure by the control-system design engineers, but I think it unlikely compared to the others.

#1 can be indirectly caused by time pressure. The certification process is supposed to slow things down, but there is some indication it did not sufficiently do so in this case.

#2 and #3 show ethical lapses outside the control-system design department, and are not just isolated to the individual in question; a safety culture needs to include norms of speaking up about potential problems even when you think you might be wrong.


Seems unlikely given that they were likely contractors: https://www.bloomberg.com/news/articles/2019-06-28/boeing-s-...


You don't need to take an ethics class to see that cutting corners when human lives are involved is a bad idea.


I think I heard somewhere that the entire purpose of engineering is to know which corners are safe to cut.

"Anyone can build a bridge that stays up. Only an engineer can build a bridge that barely stays up."

(I mean, it's obviously exaggerated for effect, but still.)


I took ethics courses as part of comp sci. Definitely left an imprint on me.


I don’t know if ASCE or NSPE are governing bodies as much as they are societies. I think the Corps of Engineers has way more power there, from my experience in Civil.

For software, the closest things to governing bodies would be Homeland Security, the FBI, and even the FAA. But software doesn’t have ethics-focused governing bodies as such, because we have enough laws mandating a lot of auditing.

For example:

HIPAA - Medical Data Protection & Compliance
Gramm-Leach-Bliley Act - Financial Data Protection & Compliance
FISMA - Federal Data Compliance
GDPR - PII Data Compliance
PCI DSS - Credit Card Data Compliance

Specific example: code that handles PCI data requires a code review before it can be deployed.

There are heavy penalties for violations of these, and I promise there are many companies focusing on this. But it could be very much improved with a proper governing body.


Someone has already said it, I think, but ethics without a way to stand up and say something is wrong is just lip service to ethics. "Sure, our engineers have ethics, but we will get some new ones if anyone complains."

All of this is about making money and nothing else. Each quarter, make these numbers at all costs. That is what almost all companies are: amoral beasties that do whatever is needed to make more money.

If they could have killed 300 more people and still kept the planes in the air, the CEO, wallstreet and anyone who held the stock would not have given a single fork.


"No single point of failure" applies to organizations too.

Boeing's corporate governance failures became life-threatening only because the FAA certification process failed. Regulatory capture of the FAA was the second failure.

Regulatory capture may be responsible for Boeing's recent problems https://www.economist.com/business/2019/03/23/regulatory-cap...

Barbara Hollingsworth: 'Regulatory capture' explains a lot about FAA's failures https://www.washingtonexaminer.com/barbara-hollingsworth-reg...

> What happens to federal employees who ignore safety warnings, cover up incompetent or even criminal behavior, destroy official documents and mislead members of Congress? At the Federal Aviation Administration (FAA), they get promoted.

> That's the take-away from last week's National Whistleblowers Assembly on Capitol Hill, sponsored by the Government Accountability Project (GAP) and featuring famous NYPD whistleblower Frank Serpico and former FBI agent Coleen Rowley.


Site availability has been intermittent. In case it's having trouble:

https://web.archive.org/web/20190509220235/https://blog.apao...


The article specifically calls out the former CEO, an engineer. IIRC he was the only engineer in Boeing's C-suite and on the board.

His egregious behavior doesn't get emphasized enough. It's beyond horrible. To back up what was said in the article, here is something from Reuters on the day after the 2nd crash:

“We are confident in the safety of the 737 MAX and in the work of the men and women who design and build it,” Boeing Chief Executive Officer Dennis Muilenburg told employees in an email seen by Reuters. “Since its certification and entry into service, the MAX family has completed hundreds of thousands of flights safely.” https://www.reuters.com/article/ethiopia-airplane-boeing-ceo...

That asshat CEO made those comments the day after the second crash, after 346 people were dead.


Someone, somewhere, said "let's butcher the design of the 737 and try and patch it up with software".

That person should be in jail.


Someone like Pat Shanahan, former acting defense secretary.

https://www.militarytimes.com/news/your-military/2019/03/13/...


That person would definitely fail FizzBuzz



