> Too often it descends into methodology theatre and too rarely does it result in useful dialogue.
That is what happens when you present weak evidence.
I have yet to see any convincing studies despite it being straightforward to test in small repeatable experiments.
It is not hard to get 20 different five-person teams to perform a complex task for a couple of weeks and measure what impact different types of diversity have.
[Just to address the other points you raise because I was in a rush last time]
> I have yet to see any convincing studies despite it being straightforward to test in small repeatable experiments.
If these experiments are so "straightforward", and your resistance to these ideas is well-founded, then where are all the experiments demonstrating that there is no such effect?
Alternatively if you believe publishing bias and/or feminist conspiracy are combining to quash all this budding anti-diversity research then why don't you run one yourself and show everyone how it's done?
If you're not up for doing the actual experiment I'm happy to pass your protocol around for some feedback, maybe I could find someone to run the experiment for you. Maybe I can even help with funding.
Sure, publishing bias exists, as does confirmation bias (on both sides of any debate). Those biases do have an effect, and 'social scientists' do tend to be left-leaning in my experience.
But, those people have done the work, usually had their protocol evaluated, written up their paper, had it reviewed, and had it published (under their real name).
Hacker News critics, on the other hand, have (typically) not done the work, have not tried to get their experiment funded and written up, and make sweeping judgements like "methodologically bankrupt" based on unqualified criticisms like "poorly designed" (no specifics or evidence of understanding), usually under an alias with effectively no repercussions.
You yourself claimed that designing convincing studies without shortcomings is "straightforward", implying that the absence of evidence you find convincing is proof that the evidence base can be broadly ignored without actual study. That's rationalising, not logic.
Similarly, when challenged to provide a design yourself, you change the subject. My offer is genuine, by the way... our investors include some of the most influential organisations in this space. If you have a great experiment design I'm happy to put it to them and give you credit.
> You yourself claimed that designing convincing studies without shortcomings is "straightforward"
Yes because - like I said in my other comment - studies on effective teamwork have been done for decades.
Point me towards a well cited meta-analysis on the impacts of race and gender diversity where the studies are repeatable experiments involving complex problem solving in a small team.
> Similarly, when challenged to provide a design yourself you change the subject.
Because you are not acting in good faith.
We both know designing a proper study would take at least a couple of months of work.
What I have asked of you, on the other hand, is a link to an existing meta-analysis that supports your claims.
For someone who works in this space and claims there is significant evidence this should be a 5-10 minute exercise.