> I assume that you would agree that the control system specification is the ethical responsibility of the control system design engineers.
It is the primary responsibility of the control-system design engineers, but everyone who interacts with that spec also has a responsibility to speak up if they notice flaws.
One of the big things that tight deadlines do is give engineers tunnel vision: "just implement the spec" becomes the goal, and the forest gets missed for the trees.
There were probably dozens of engineers who saw the MCAS specification as part of their duties; here are a few possibilities for what happened:
1. Nobody considered the case of improper MCAS engagement under normal flight conditions; this should clearly qualify the system for "Hazardous" classification under DO-178, which would require redundant AOA sensors.
2. Someone considered this case but didn't speak up (they were very junior, or it was way outside their specialty).
3. Someone spoke up, but the person they raised it with disregarded it for the same reasons as #2, so it never reached the control-system design team.
4. Someone spoke up, it reached the control-system design team, and business pressures meant the concern was never investigated.
#4 would be a significant ethical failure by the control-system design engineers, but I think it unlikely compared to the others.
#1 can be indirectly caused by time pressure. The certification process is supposed to slow things down, but there is some indication it did not sufficiently do so in this case.
#2 and #3 show ethical lapses outside the control-system design department, and they are not isolated to the individual in question: a safety culture needs cultural norms of speaking up about potential problems even when you think you might be wrong.
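To make the redundancy point in #1 concrete, here is a minimal sketch of the kind of two-sensor cross-check that redundant AOA inputs enable. This is purely illustrative; the function names, thresholds, and logic are my assumptions, not Boeing's actual implementation or any real certification artifact:

```python
# Hypothetical sketch of a two-sensor AOA cross-check.
# All names and thresholds are illustrative assumptions.

AOA_DISAGREE_LIMIT_DEG = 5.5  # illustrative disagreement limit

def mcas_may_engage(aoa_left_deg: float, aoa_right_deg: float,
                    aoa_trigger_deg: float = 14.0) -> bool:
    """Allow engagement only when both vanes agree and both exceed the trigger."""
    # If the sensors disagree beyond the limit, treat the data as suspect
    # and inhibit the system rather than trusting a single (possibly failed) vane.
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        return False
    return min(aoa_left_deg, aoa_right_deg) > aoa_trigger_deg
```

With only a single sensor feeding the system, the disagreement check is impossible, which is exactly why a "Hazardous" classification pushing toward redundancy matters.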