I did fake Bayesian math with some plausible numbers, and found that if I started out believing there was a 20% per decade chance of a lab leak pandemic, then if COVID was proven to be a lab leak, I should update to 27.5%, and if COVID was proven not to be a lab leak, I should stay around 19-20%.

This is so confusing: why bother doing “fake” math? How does he justify these numbers? Let’s look at the footnote:

Assume that before COVID, you were considering two theories:

  1. Lab Leaks Common: There is a 33% chance of a lab-leak-caused pandemic per decade.
  2. Lab Leaks Rare: There is a 10% chance of a lab-leak-caused pandemic per decade.

And suppose before COVID you were 50-50 about which of these were true. If your first decade of observations includes a lab-leak-caused pandemic, you should update your probability over theories to 76-24, which changes your overall probability of pandemic per decade from 21% to 27.5%.

Oh, he doesn’t, he just made the numbers up! “I don’t have actual evidence to support my claims, so I’ll just make up data and call myself a ‘good Bayesian’ to look smart.” Seriously, how could a reasonable person have been expected to be concerned about lab leaks before COVID? It simply wasn’t something in the public consciousness. This looks like some serious hindsight bias to me.
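For what it's worth, the arithmetic (as opposed to the inputs) does follow mechanically from the invented numbers. A quick sketch of the update, treating each decade as a single yes/no observation as the footnote implicitly does (variable names are mine):

```python
# The footnote's two made-up hypotheses about the per-decade chance
# of a lab-leak-caused pandemic:
p_common = 0.33   # "Lab Leaks Common"
p_rare = 0.10     # "Lab Leaks Rare"
prior_common = 0.5  # 50-50 prior over the two theories

# Prior predictive probability of a lab-leak pandemic per decade:
prior_pred = prior_common * p_common + (1 - prior_common) * p_rare
print(f"prior per-decade probability: {prior_pred:.1%}")   # 21.5%

# Observe one decade that DOES contain a lab-leak pandemic; Bayes' rule:
posterior_common = (prior_common * p_common) / prior_pred
print(f"posterior on 'Common': {posterior_common:.0%}")    # 77%

posterior_pred = posterior_common * p_common + (1 - posterior_common) * p_rare
print(f"updated per-decade probability: {posterior_pred:.1%}")  # 27.7%

# Observe one decade that does NOT contain one:
p_no_leak = prior_common * (1 - p_common) + (1 - prior_common) * (1 - p_rare)
posterior_common_no = prior_common * (1 - p_common) / p_no_leak
posterior_pred_no = (posterior_common_no * p_common
                     + (1 - posterior_common_no) * p_rare)
print(f"after a quiet decade: {posterior_pred_no:.1%}")    # 19.8%
```

This reproduces the quoted figures: roughly 76-24 over the theories, the ~19-20% no-leak case, and the 27.5% (exactly, if you round the posterior to 76-24 before recomputing, which appears to be what he did). The machinery is fine; the complaint is that the 33% and 10% came from nowhere.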

I don’t entirely accept this argument - I think whether or not it was a lab leak matters in order to convince stupid people, who don’t know how to use probabilities and don’t believe anything can go wrong until it’s gone wrong before. But in a world without stupid people, no, it wouldn’t matter.

Ah, no need to make the numbers make sense, because stupid people wouldn’t understand the argument anyway. Quite literally: “To be fair, you have to have a really high IQ to understand my shitty blog posts. The Bayesian math is extremely subtle…” And, convince stupid people of what, exactly? He doesn’t say, so what was the point of all the fake probabilities? What a prick.

  • Architeuthis@awful.systems

    Hi, my name is Scott Alexander and here’s why it’s bad rationalism to think that widespread EA wrongdoing should reflect poorly on EA.

    The assertion that having semi-frequent sexual harassment incidents go public is actually an indication of health for a movement since it’s evidence that there’s no systemic coverup going on and besides everyone’s doing it is uh quite something.

    But surely of 1,000 sexual harassment incidents, the movement will fumble at least one of them (and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet). You’re not going to convince me I should update much on one (or two, or maybe even three) harassment incidents, especially when it’s so easy to choose which communities’ dirty laundry to signal boost when every community has a thousand harassers in it.

    • titotal@awful.systems

ahh, I fucking haaaate this line of reasoning. Basically saying “if we’re no worse than average, then there’s no problem”, followed by some discussion of “base rates” of harassment or whatever.

Except that the average rate of harassment and abuse, in pretty much every large group, is unacceptably high unless you take active steps to prevent it. You know what’s not a good way to prevent it? Downplaying reports of harassment, calling the people bringing attention to it biased liars, and explicitly trying to avoid kicking out harmful characters.

Nothing like a so-called “effective altruist” crowing about having a C- passing grade on the sexual harassment test.

    • self@awful.systemsM

      and often the fact that you hear about it at all means the movement is fumbling it less than other movements that would keep it quiet

      I just can’t get over how far this is from reality. like fuck, for a lot of these things the controversy is the community covering for the abuser, or evidence coming out that sexual harassment was covered up in the past. depressingly often in tech, the community doesn’t even try to keep it quiet; instead they just loudly endorse the abuser or talk about how there’s nothing they can do.

    • swlabr@awful.systems

      Scott: “Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries. What can I do to help? I know, I’ll bring up 9/11”

      Empty room: “…”

      “And I’ll throw out some made up statistics about terrorist attacks and how statistically we were due for a 9/11 and we overreacted by having any response whatsoever. And then I’ll show how that’s the same as when someone big in EA does something bad.”

      “…”

      “Especially since it’s common for people to, after a big scandal, try and push their agenda to improve things. We definitely don’t want that.”

      “…”

      “Also, on average there’s less SA in STEM, and even though there is still plenty of SA, we don’t need to change anything, because averages.”

      “…”

      “Anyway, time for dexy no. 5”

      • hirudiniformes@awful.systemsOP

        Hmm, the reputation of the EA community that I am part of and love for some reason is tanking, due to the bad actions of its luminaries.

        “And it would be clear I’m full of shit if I put this at the start of the article, so I’ll bury the lede behind a wall of text”